Evaluation of Occupational Safety and Health Educational Materials for Home Care Workers

0920-10CB





Supporting Statement

PART B









Sherry Baron, MD

Project Officer

SBaron@CDC.GOV


National Institute for Occupational Safety and Health

Division of Surveillance, Hazard Evaluations, and Field Studies

4676 Columbia Parkway

Cincinnati, Ohio 45226


513-458-7159 (tel)

513-841-4489 (fax)


October 2010

B. Collection of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

B2. Procedures for the Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The proposed information collection will be used to evaluate an occupational safety and health intervention developed by NIOSH targeting home care workers and their clients in Alameda County, California. The intervention consists of a printed home care worker handbook and a one-hour training program on how to use the handbook (Attachment B). The major objective is to demonstrate that, following the intervention: 1) home care workers have improved self-efficacy to identify and address safety and health hazards in their workplace, and 2) workers are more satisfied with their work arrangement and clients are more satisfied with their caregivers. The evaluation will use an experimental design with an intervention (experimental) group and a control group, with a pre-intervention and a post-intervention survey administered to both the participating workers and their primary clients. The control group will receive the training intervention after completion of the post-survey. There will be a two-month interval between the pre- and post-surveys. The design is summarized in the table below:


Evaluation Study Design

                                     |--------------------- Time (2 months) ---------------------|
Study Group                          Pre Survey   Training Intervention   Post Survey   Training Intervention
                                                  for Home Care Workers                 for Home Care Workers
Experimental - Home care workers     O            X                       O
Experimental - Clients               O            X                       O
Control - Home care workers          O                                    O             X
Control - Clients                    O                                    O             X

Note: "O" indicates an observation and "X" represents the training intervention. Assignment to the experimental or control group will be randomized.


B1. Respondent Universe and Sampling Methods


Definition of the Target Population

The target population for the survey is employed home care workers working in the Alameda County In-Home Supportive Services (IHSS) program and their primary clients. This includes approximately 15,000 workers and their clients.

Data on the gender, race, and ethnic composition of the Alameda County home care worker population are available from a 2002 survey of a random sample of 500 workers, which found the following demographic characteristics: 75% are persons of color (43% Black, 25% Asian, 7% Latino), most are over 40 years old (28% are 55 or older and 37% are 41-54), and 80% are female. The two most common languages other than English are Spanish and Chinese [East Bay Alliance 2002]. Home care is an occupation with a high turnover rate; the same survey found that 20% of workers had been working less than 1 year as a home care worker. Since the evaluation design assumes that the home care workers will remain employed during the two-month interval between the pre- and post-surveys, the evaluation will target the more stable 80% of workers with at least 1 year of experience as a home care worker.


Descriptive statistics for the clients served by the Alameda County IHSS program show the following profile: 4% of clients are blind or deaf, 11% are "wheelchair-bound," 2% are "bed-bound," and 11% have a mental disability. Most clients are classified as needing major assistance with household tasks such as laundry (98%), housekeeping (98%), shopping (91%), and meal preparation (78%). Fewer need major assistance with tasks related to personal care, such as bathing (45%), dressing (30%), and transfers within the house (17%) [Baron 2004].


Specification of Sample Selection Procedures


Home Care Worker sample selection process: From this universe of approximately 15,000 home care workers, a sample of 320 workers and their clients will be included in the study, with half (160) randomly assigned to the intervention group and half (160) to the control group. Since the three most common primary languages of workers in Alameda County are English, Spanish, and Chinese (mainly Cantonese), versions of the intervention handbook have been developed in those three languages. The sample will therefore also include approximately equal numbers of respondents in each group whose primary language is English, Spanish, or Chinese.


We will recruit participants through a mailing to a stratified random sample of 5,000 current home care workers extracted from the regularly updated Alameda County IHSS program employee database (see Attachment D for the text of the recruitment letters). The sample will be stratified to include approximately equal numbers of English, Spanish, and Chinese speakers, using the preferred-language variable in the employee database. The mailing will include a letter (in either English and Spanish or English and Chinese) describing the training program and the evaluation study and inviting workers to volunteer to participate. It will also include a letter directed to clients explaining the study, which home care workers will be encouraged to share with their clients. An interest response form and a stamped envelope addressed to the survey contractor will also be included in the mailing (see Attachment D for the text of the letters and response form). Both the worker and their primary client must sign the response form (Attachment D2), indicating that the home care worker is volunteering to participate and that the client is aware of her participation.


We recognize that mailed surveys usually have a low response rate. However, even a 10% return rate for the interest response forms (Attachment D2) would yield approximately 500 interested workers from the 5,000 mailed packages, which should provide sufficient interest to recruit the target sample of 320.


Once response forms are received by the survey contractor, the contractor will randomly assign each eligible participant to the intervention or the control group. Eligibility to participate will require the following: 1) current employment as a home care worker in the IHSS program, 2) at least 1 year of reported experience working in the IHSS home care program, 3) willingness to complete two telephone surveys and to attend a training session, 4) willingness to have the worker's primary client contacted and invited to participate in the study, and 5) fluency in English, Spanish, or Cantonese. While workers must be willing to have the study contact their client, a client's decision not to complete a survey will not compromise the eligibility of the home care worker. If interest exceeds the necessary sample size, interested workers who were not selected as participants will be sent a thank-you note and a copy of the handbook at the completion of the evaluation.
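
To illustrate the assignment procedure, the following Python sketch shows one way the language-stratified random assignment described above could be carried out. The function name and data structure are hypothetical and are included only for illustration; they do not describe the survey contractor's actual system.

    import random

    # Illustrative sketch only: eligible volunteers are grouped by primary
    # language and split evenly, at random, between the intervention and
    # control groups. Names and structures are hypothetical.
    def assign_study_groups(eligible_workers, seed=2010):
        """eligible_workers: list of dicts, each with a 'language' key
        ('English', 'Spanish', or 'Chinese'). Returns (intervention, control)."""
        rng = random.Random(seed)  # fixed seed only so the sketch is reproducible
        intervention, control = [], []
        for language in ("English", "Spanish", "Chinese"):
            stratum = [w for w in eligible_workers if w["language"] == language]
            rng.shuffle(stratum)
            half = len(stratum) // 2
            intervention.extend(stratum[:half])
            control.extend(stratum[half:2 * half])
            # workers beyond an even split remain available as replacements
        return intervention, control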


Once enrolled, the home care workers will complete the worker pre-survey (Attachment C1) and, two months later, the worker post-survey (Attachment C2).


Client sample selection process: The Alameda County home care program will provide the survey contractor with a database that includes the names and contact information for the sample of 5,000 workers to whom the recruitment materials were mailed (Attachment D), linked to the names and contact information for the client(s) served by each worker. Using the information supplied on the returned home care worker interest response forms (see Attachment D2), the survey contractor will match the name of the client on the response form to the client contact information supplied by the county. The contractor will then contact the client by telephone to confirm that they are served by the enrolled home care worker and invite them to participate in the study. Once enrolled, the client will complete the pre- and post-surveys (Attachments C3 and C4) on a schedule similar to their home care worker's.


If a client is not cognitively able to complete the surveys, or if the client declines, this will be noted and no further information will be collected from the client. A nonresponse by the client will not affect the eligibility of the home care worker to remain in the study. Although the proportion of clients who will be nonrespondents is difficult to predict, a study of client satisfaction among a random sample of about 1,000 clients enrolled in the California IHSS program had a response rate of 78% using a similar telephone interview format, and less than 10% of respondents could not respond due to cognitive limitations [Benjamin 2000]. For this survey we will use methods developed by our consultants at the Scripps Gerontology Center at Miami University to determine when a respondent should be considered cognitively unable to respond to the survey. Scripps researchers recommend that interviewers terminate the interview when the respondent is unable to respond to four consecutive questions. For this survey, if the client cannot respond to the Background section of the client pre-survey (the first four questions of Attachment C3), the interview will be terminated [Murdock 2005].



B2. Procedures for the Collection of Information


Following OMB approval, survey materials will be prepared. Survey materials will consist of: 1) the home care worker interest response form included in the recruitment letter (Attachment D2), 2) the home care worker pre-survey (Attachment C1), 3) the home care worker post-survey (Attachment C2), 4) the client pre-survey (Attachment C3), and 5) the client post-survey (Attachment C4). Also included as a survey material is the home care worker training program curriculum guide (Attachment B2), although no survey data will be collected during the training. The data collection will be completed via computer-assisted telephone interviewing (CATI).


All interviewers will be professionally trained and will also be required to attend an additional training session specific to the questionnaires used in the study. During the training session, interviewers will have the opportunity to ask questions and practice administering the questionnaire with the CATI program. Interviewers who will administer non-English surveys will be proficient in either Spanish or Cantonese and will have prior experience conducting telephone interviews in those languages. Interviewers will also be instructed in how to interview elderly respondents and how to determine whether an interview should be terminated because of a client's cognitive or hearing limitations (see the example training manual for interviews with the elderly in Murdoch 2005). In addition, all interviewers will be randomly monitored by telephone on a weekly basis to identify any issues that may require clarification or further instruction.


Pre-survey publicity will be conducted through announcements at home care workers' labor union meetings and in the lobby of the Alameda County IHSS administrative building, where many workers go to submit time sheets (a sample flyer is included in Attachment I). The mailing of the recruitment package (Attachment D) will occur during this pre-survey publicity period.


Once interest response forms (Attachment D2) are received by the contractor, the contractor will confirm from the information provided that the worker is eligible to participate and will then randomly assign the worker to either the intervention or the control group. The contractor will then call these workers to enroll them in the study. The contractor will attempt to call each worker who returns an interest response form at least 5 times, at different times of day and on different days of the week, before the worker is considered nonresponsive. Nonresponsive workers will be replaced by the next randomly selected worker in the same study group (intervention or control) and language group.


Each newly enrolled worker will complete the pre-survey (Attachment C1), be scheduled to participate in the training program during which they will receive the handbook (Attachment B), and be given an appointment for the post-survey (Attachment C2) two months after the pre-survey. The intervention group will be scheduled for the training program in the month following the pre-survey (Attachment C1); the control group will be scheduled for the training after completing the post-survey (Attachment C2).


Once a home care worker has been enrolled in the study, the contractor will contact the worker's primary client to invite them to participate in the study. The Alameda County home care program will provide the contractor with a database that includes the names and contact information for each worker who was mailed a recruitment package (Attachment D), linked to the name and contact information for that worker's client(s). The contractor will identify which client signed the interest response form (Attachment D2) and, using the contact information in the database provided by Alameda County, will contact that client by telephone and invite them to participate in the study. A letter signed by the administrator of the Alameda County IHSS program, included in the worker recruitment package, should already have been shared with the client (Attachment D). However, once a worker has been successfully enrolled, the contractor will mail a duplicate copy of this letter directly to the client.


To ensure a good retention rate for both the workers' and clients' post-surveys (Attachments C2 and C4), interviewers will collect detailed information about the best times of day and days of the week to reach each participant and will request multiple telephone numbers (such as home and cell phone). To improve response rates, respondents will also receive a reminder card by mail halfway through the two-month period (Attachment J2). For the follow-up post-surveys (Attachments C2 and C4), interviewers will attempt to reach each respondent at least 10 times, at different times of day, on different days of the week, and at alternative numbers. A mailed reminder will also be sent before a participant is considered lost to follow-up (Attachment J2).


Home care workers will receive a $20 grocery gift card for completing each of the two surveys. They will also receive a $40 grocery gift card for participating in the training program. Reimbursement for clients will total $35 ($10 after the first survey and $25 after the second), also in the form of grocery gift cards.

Once an individual volunteers to participate in the study, he/she will be provided with a verbal explanation of the study, including a description of the study, potential risks of participating, the right to terminate participation at any point, steps taken to protect anonymity, and how the interview information will be handled and used. Participants will be reminded that they have the right to decline to answer any question they are uncomfortable with or to terminate their participation at any point in the interview. Any questions participants may have about study procedures or their rights as participants will be answered at this time. It will also be stressed that study staff will make their best effort to protect participants' confidentiality and that no names will be recorded on questionnaires. Contact information will be provided should participants have any questions about the study or their rights at any point during the study (see Attachment I for the text of the consent).


Volunteers will also be provided with printed material containing the information that would normally be included in a written consent form, along with their incentive payment; this will occur at the training session for home care workers and by mail for clients (consent text is provided in Attachment I).


Sample Size Calculations


The primary outcome measure for this evaluation study is whether the home care worker intervention group shows a greater improvement in self-efficacy than the control group. Self-efficacy is defined as beliefs regarding one's ability to successfully carry out a course of action. It has been applied to a wide variety of research, including health promotion activities [Bandura 1998]. It measures both an individual's sense of confidence to undertake and master an activity and how likely he or she is to sustain an effort in the face of obstacles.


In the present study, self-efficacy will assess the degree to which home care workers believe they have the skills, abilities, and resources to identify and reduce potential health and safety hazards. Self-efficacy will be measured using a validated scale developed at Stanford University [Heaney in press]. The scale score is the mean response on a 4-point Likert scale (range 1 to 4) and is treated as a continuous variable. To calculate a sample size that allows us to assess the effectiveness of the intervention in increasing self-efficacy among home care workers, we will compare the change in mean self-efficacy scores in the intervention group with the change in mean self-efficacy scores in the control group. The Stanford University researchers have assessed the psychometrics of their self-efficacy scale on a sample of approximately 100 home care workers, and those results are used to obtain a more accurate estimate of effect size for the current study (see Table B2-1).


The effect size index d is defined as:


d = (μ − μ0) / σ


where σ denotes the standard deviation in the population, μ0 denotes the mean at the pretest and μ denotes the mean at the posttest [Faul 2009].


Based on the Stanford sample, the reliability (Cronbach's α = .71) and validity of the self-efficacy scale were confirmed; the mean self-efficacy score was μ0 = 2.89 with a standard deviation of σ = .79. The sample size n is computed as a function of the required power level (1-β), the pre-specified significance level α, and the population effect size d.


Previous intervention studies aimed at behavior change among participants and using self-efficacy as an outcome measure have shown increases in mean self-efficacy scores from pre- to post-test of around .25 (e.g., Gould 2009). Based on this increase of .25 from μ0 = 2.89 and a given σ = .79, the effect size for detecting changes in the mean from pre- to post-test is estimated at d = .32. Previous studies have reported effect sizes for self-efficacy-enhancing interventions of around d = .30 (e.g., Jerant 2008). Based on Table B2-1 below, to detect an effect of d = .32 in the intervention group with a test power of .80 and an alpha of .05, we will need to study n = 81 home care workers in each group.
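
As an illustrative check of the figures cited above, the effect size and the approximate required sample size can be reproduced with standard statistical software. The sketch below uses Python's statsmodels power routines rather than G*Power, so its result may differ slightly from the entries in Table B2-1 depending on the specific test and options assumed there.

    from statsmodels.stats.power import TTestPower

    # Effect size from the Stanford pilot values cited above: an expected
    # pre-to-post change of .25 on a scale with standard deviation .79.
    sigma = 0.79
    expected_change = 0.25
    d = expected_change / sigma
    print(round(d, 2))  # 0.32

    # Approximate required sample per group for a pre/post comparison at
    # alpha = .05 and power = .80. Table B2-1 was produced with G*Power,
    # so its entries may differ slightly from this sketch.
    n = TTestPower().solve_power(effect_size=d, alpha=0.05, power=0.80,
                                 alternative='two-sided')
    print(round(n))  # on the order of 80 workers per group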


Community-based educational interventions are subject to many confounders in the field that may diminish the effect size. Potential confounders for this study include poor client cooperation with making safety changes and difficulty finding the time and materials to make the recommended changes. A more conservative approach would allow for detecting smaller effects, e.g., Cohen's (1969) classification of effect sizes, where a small effect is set at d = .20, a medium effect at d = .50, and a large effect at d = .80.


Worker turnover and attrition between the two surveys may also necessitate a larger initial sample size. One recent study of mental health outcomes among several thousand IHSS home care workers in Los Angeles included a follow-up interview six months after the initial interview; it achieved a retention rate of 75%, and less than 1% of the home care workers were no longer working as a home care worker at the six-month follow-up call (Delp, personal communication). Given the much shorter follow-up period for this study (two months versus six), combined with the use of self-selected volunteers who have worked at least 1 year in the program, we anticipate a retention rate of at least 80% across the two surveys.


Given the limitations imposed by both confounders and attrition, we ultimately chose to survey 160 home care workers in each group. Even with some loss to follow-up, this sample size has sufficient power to demonstrate the expected effect (d = .32) while also anticipating the impact of confounders, which may push the effect size toward the smaller d = .20 [Cohen 1969] (see Table B2-1).
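
The arithmetic behind this choice can be made explicit in a brief sketch (the 80% retention figure is the anticipated rate discussed above, not an observed value):

    # Sketch of the retention arithmetic described above.
    enrolled_per_group = 160     # chosen enrollment per study group
    assumed_retention = 0.80     # anticipated retention across the two surveys
    completers = enrolled_per_group * assumed_retention
    print(completers)            # 128.0 expected completers per group

    # Table B2-1 indicates 81 completers per group are needed to detect
    # d = .32 at alpha = .05 with power = .80, so 128 leaves a margin for
    # additional attrition or a somewhat smaller realized effect.
    print(completers >= 81)      # True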


While our sample size was chosen to allow us to measure a statistically significant improvement in self-efficacy for home care workers, we also considered whether this sample size would be able to detect improved satisfaction among treatment group clients. The measure we are using for client satisfaction is a slightly modified version of a well-validated instrument, the Home Care Satisfaction Measure [Geron 2000]. This measure has been extensively field-tested; the mean satisfaction level was 77.39 (on a 100-point scale) with a standard deviation of 15.56 [Murdock 2004]. Using the sample size estimates above, this study would be able to detect between a 3-point (effect size .20) and a 5-point (effect size .32) improvement in client satisfaction. A study evaluating the impact of a new model for delivering home care services found that clients in the treatment group had satisfaction scores 20 points higher (on a 100-point scale) than the control group [Carlson 2007]. Although that was a more extensive intervention than the one introduced in our study, it suggests that a modest 3- to 5-point improvement is achievable.
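
A similar back-of-the-envelope check applies to the client satisfaction outcome, using the published standard deviation cited above (a sketch; the figures are illustrative):

    # Detectable improvement on the 100-point Home Care Satisfaction Measure,
    # using the published standard deviation cited above [Murdock 2004].
    sigma_satisfaction = 15.56

    for d in (0.20, 0.32):
        detectable_points = d * sigma_satisfaction
        print(d, round(detectable_points, 1))
    # d = 0.20 -> about 3.1 points; d = 0.32 -> about 5.0 points,
    # consistent with the 3- to 5-point range stated above.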

Table B2-1: Sample Size Calculation

Significance level (α)   Statistical power (1-β)   Required sample per group (n), d = .32   Required sample per group (n), d = .20
.01                      .95                        182                                      438
.01                      .90                        152                                      367
.01                      .80                        120                                      289
.05                      .95                        132                                      319
.05                      .90                        107                                      259
.05                      .80                         81                                      194
.10                      .95                        110                                      266
.10                      .90                         87                                      211
.10                      .80                         64                                      153

Note. Effect size calculations are based on σ = .79. Detectable mean difference: μ − μ0 = .25 at d = .32; μ − μ0 = .16 at d = .20.

Note. The program G*Power was used for sample size calculations (Erdfelder, E., Faul, F., & Buchner, A., 1996).




B3. Methods to Maximize Response Rates and Deal with Nonresponse


Methods to Maximize Response Rate


While home care workers and their clients may be a challenging target population for a telephone survey, previous research has demonstrated adequate response rates. One study of similar home care workers in Los Angeles County found that the largest impediment to recruitment into a telephone survey was successfully reaching respondents by telephone, since many telephone numbers had been disconnected. However, once a home care worker was reached by telephone, the participation rate was 74% (Delp 2010). That study also included a follow-up interview six months after the initial interview and achieved a retention rate of 75%; less than 1% of the home care workers were no longer working as a home care worker at the six-month follow-up call (personal communication with study investigator Dr. Linda Delp, UCLA School of Public Health, Ldelp@ucla.edu).


Our use of mailed interest response forms (Attachment D) should improve response rates both by targeting current workers who have already expressed interest and by providing us with current telephone numbers and convenient times when workers can be reached by telephone. Lessons learned from this and other studies also suggest that turnover should not significantly affect our retention rate over the two-month period between surveys, especially when targeting workers with at least one year of work experience. Because our follow-up period is two months rather than six, we expect to achieve an even higher retention rate than the 75% demonstrated in the previous survey. The use of incentive payments has been shown to be important in maximizing response rates in previous studies (e.g., Abreu 1999; Shettle 1999). The incentive payment rate for this study is consistent with previous studies of home care workers (e.g., the National Home Health Aide Survey, OMB No. 0920-0298).


As with home care workers, their clients can also be challenging survey respondents. While clients tend to be easier to reach by telephone because they are more likely to be at home, cognitive or physical limitations may limit their ability to participate in telephone surveys. However, experience shows that nonresponse due to these limitations is likely to be modest. In one study using a client satisfaction survey form similar to the one used in this study, only 6% of clients were too cognitively or physically frail to respond to the survey [Benjamin 2000, 2001]. Some studies examining satisfaction with home care services have allowed proxy responses by family members who were not the primary caregiver, but in general only a small proportion of clients require proxy respondents; for example, in one study of satisfaction with a home care program [Wiener 2007], 17% of client responses were from proxies. Recognizing the added difficulty for elderly respondents, we have kept the client survey short, and questions have been drawn from well-validated surveys designed for older populations [Geron 2000, Murdock 2004].


B4. Tests of Procedures or Methods to be Undertaken


Data Collection Forms


The questions included in the data collection forms were drawn, whenever possible, from large, well-validated survey instruments. For the home care workers' surveys (Attachments C1 and C2), many questions were drawn from two large surveys conducted by the National Center for Health Statistics that target similar worker populations: 1) nursing assistants in nursing homes (National Nursing Home Survey, OMB No. 0920-0353) and 2) home health aides (National Home and Hospice Care Survey, OMB No. 0920-0298). Some questions were also drawn from the NIOSH Quality of Worklife survey, which has been part of the General Social Survey in 2002, 2006, and 2010, and were adapted to the unique characteristics of home care workers. Both the social support and the self-efficacy scales were drawn from validated measures designed specifically for caregivers (Heaney 1991). The client survey questions (Attachments C3 and C4) were drawn from well-validated measures of client satisfaction and from demographic and other questions included in surveys evaluating home care programs administered by the Scripps Gerontology Center at Miami University [Applebaum 2007, Murdock 2004]. Attachment K provides a detailed description of the source of each question.

Individuals reviewing the survey instrument include:


Robert Applebaum, PhD

Suzanne Kunkel, PhD

Scripps Gerontology Center

Miami University

Oxford, Ohio

Email: applebra@muohio.edu

kunkels@muohio.edu

Phone: (513) 529-2632



Catherine Heaney, PhD

Annekatrin Hoppe, PhD

Stanford Prevention Research Center
Stanford University School of Medicine
Hoover Pavilion, Mail Code 5705
211 Quarry Road, Room N229
Stanford, California 94305-5705

Email: Ahoppe@stanford.edu

Cheaney@stanford.edu

Phone (650) 721-1542


Laura Stock, MPH

University of California, Berkeley

Labor Occupational Health Program

2223 Fulton Street
Berkeley, CA 94720-5120

Email: LStock@berkeley.edu

Phone: (510) 642-5507


Linda Ayala, MPH, Training Coordinator

Charles Calavan, Executive Director

Public Authority for IHSS of Alameda County

6955 Foothill Blvd., 3rd Floor

Oakland, CA 94605

Email: LSAyala@ac-pa4ihss.org

Phone (510) 577-3554


Susannah McDevitt, Director

Richard Soohoo, Organizer

United Longterm Care Workers Union

Service Employees International Union

440 Grand Ave., Suite 250
Oakland, CA 94610

Email: SusannahM@seiu-ultcw.org



Translation of survey instrument to Spanish and Cantonese


Translation of the questionnaires into Spanish and Chinese was done in the following manner. First, the questionnaires were translated from English into the target language and then back-translated into English. A team of bilingual individuals then reviewed the translation and back-translation and selected the final wording. Finally, the translations were field-tested with three home care workers or clients who were native speakers of Spanish or Cantonese, and they provided additional input.


Pilot Testing


The survey was pilot tested using telephone interviews with nine workers and clients. Survey questions asking for specific reactions to the intervention materials were piloted during field testing of the draft handbook with nine workers and clients.


Concerns raised during pilot testing related to: 1) rewording of several individual questions, 2) clearer definitions of terms used in the questionnaire, and 3) changes to the length and format of the questionnaire to ensure easy understanding and flow. Most frequently, problems were addressed by simplifying and shortening questions and by changing wording to be more specific to the unique characteristics of home care work tasks.



B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


NIOSH has consulted with experts in evaluation design, including statistical design, at the Stanford Prevention Research Center (specializing in evaluation of worker safety and health programs) and at the Scripps Gerontology Center, Miami University (specializing in evaluation of long-term care programs).


The following individuals have worked as consultants on the design of the survey materials and the development of the study methodology; they will assist with the data collection phase of the survey and will provide consultation to NIOSH on statistical analysis of the study data.


Robert Applebaum, PhD

Suzanne Kunkel, PhD

Scripps Gerontology Center

Miami University

Oxford, Ohio

Email: applebra@muohio.edu

kunkels@muohio.edu

Phone: (513) 529-2632



Catherine Heaney, PhD

Annekatrin Hoppe, PhD

Stanford Prevention Research Center
Stanford University School of Medicine
Hoover Pavilion, Mail Code 5705
211 Quarry Road, Room N229
Stanford, California 94305-5705

Email: Ahoppe@stanford.edu

Cheaney@stanford.edu

Phone (650) 721-1542


Laura Stock, MPH

University of California, Berkeley

Labor Occupational Health Program

2223 Fulton Street
Berkeley, CA 94720-5120

Email: LStock@berkeley.edu

Phone: (510) 642-5507


Linda Ayala, MPH, Training Coordinator

Charles Calavan, Executive Director

Public Authority for IHSS of Alameda County

6955 Foothill Blvd., 3rd Floor

Oakland, CA 94605

Email: LSAyala@ac-pa4ihss.org

Phone (510) 577-3554


Susannah McDevitt, Director

United Longterm Care Workers Union

Service Employees International Union

440 Grand Ave., Suite 250
Oakland, CA 94610

Email: SusannahM@seiu-ultcw.org


Fang Gong, PhD

Sociologist

(Formerly project research staff at NIOSH)

Ball State University

Muncie, Indiana 47306

FGong@BSU.edu

Office: (765) 285-5149




The following individuals are the project staff from NIOSH.


Sherry Baron, MD, MPH

Medical Epidemiologist

Surveillance Branch

4676 Columbia Parkway R-17

Cincinnati, OH 45226

SBaron@cdc.gov

(513) 458-7159



Rui Shen, PhD

Contract Statistician

Surveillance Branch

4676 Columbia Parkway R-17

Cincinnati, OH 45226

RShen@cdc.gov

(513) 841-4233




LITERATURE CITED



Abreu, D.A., & Winters, F. 1999. Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Applebaum R, Kunkel S, Wilson K. Transforming data into practical information: using consumer input to improve home-care services. The Gerontologist 2007;47(1):116-122.

Bandura, A (1997) Self-Efficacy: the Exercise of Control. W.H Freeman and Company, New York.

Baron S, McPhaul K, Phillips S, Gershon R, Lipscomb J. Protecting home health care workers: a challenge to pandemic influenza preparedness planning. Am J Public Health. 2009 Oct;99 Suppl 2:S301-7

Baron S, Habes D. Health hazard evaluation report: HETA 2001-0139-2930, Alameda County Public Authority for In-Home Support Services, Alameda, California. NIOSH 2004. Available at: http://www.cdc.gov/niosh/hhe/reports/pdfs/2001-0139-2930.pdf

Benjamin AE, Mathias R. Work-life difference and outcomes for agency and consumer-directed home-care workers. Gerontologist 2004; 44(4):279-288.

Benjamin AE, Mathias R. Age, Consumer Direction, and Outcomes of Supportive Services at Home. Gerontologist 2001; 41(5):632-642.

Benjamin AE, Matthias R, Franke TM. Comparing consumer-directed and agency models for providing supportive services at home. Health Services Research 2000;35(1, Pt 2).

Brown RS, Dale SB. The research design and methodological issues for the Cash and Counseling evaluation. Health Services Research 2007;42(1 Pt 2):414-445.

Bureau of Labor Statistics [1997]. Injuries to Caregivers Working in Patients’ Homes. In: Issues in Labor Statistics, Summary 97-4. Washington, D.C.: U.S. Department of Labor, Bureau of Labor Statistics.

Carlson BL, Foster L, Dale SB, Brown R.: Effects of Cash and Counseling on personal care and well-being. Health Serv Res. 2007 Feb;42(1 Pt 2):467-87.

Cohen, J. (1969). Statistical power analysis for the behavioral sciences. New York, NY: Academic Press.

Delp L, Wallace SP, Geiger-Brown J, Muntaner C. Job stress and job satisfaction: home care workers in a consumer-directed model of care. Health Services Research. 2010; in press.

East Bay Alliance for a Sustainable Economy and Center for Labor Research and Education, University of California Berkeley. [2002]. Struggling to Provide: A Portrait of Alameda County Homecare Workers.

Faul, F, Erdfelder, E., Lang, A. & Buchner, A. (2009). G*Power 3. Retrieved from: www.psycho.uni-duesseldorf.de/abteilungen/aap/gpower3.

Geron, S. M., Smith, K., Tennstedt, S., Jette, A., Chassler, D., & Kasten, L. The home care satisfaction measure: A client-centered approach to assessing the satisfaction of frail older adults with home care services. Journal of Gerontology: Social Sciences, 2000; 55B, S259–S270.

Gould, M., Jasik, C., Lustig, R. & Gaber, A. (2009). A clinic-based nutrition intervention improves self-efficacy to change behavior in obese adolescents and parents. Journal of Adolescent Health, 44 (2), 39-40.

Heaney, C. A. (1991). Enhancing social support at the workplace: Assessing the effects of the caregiver support program. Health Education Quarterly, 18(4), 447-494.

Heaney C, Hoppe A. Caregiver self-efficacy: measurement and psychometrics (In press).

Howe C. The impact of a large wage increase on the workforce stability of IHSS home care workers in San Francisco County. 2002. Available at: http://www.directcareclearinghouse.org/download/WorkforceStabilityPaper.pdf

Howe C. Love, money or flexibility: What motivates people to work in consumer-directed home care. Gerontologist. 2008; 28 (special issue 1):40-59.

Jerant, A., Moore, M, Lorig, K. & Frank, P. (2008). Perceived control moderated the self-efficacy-enhancing effects of a chronic illness self-management intervention. Chronic Illness, 4 (3), 173-182

Murdock L, Kunkel S, Applebaum R, Straker J. Care managers as research interviewers: A test of a strategy for gathering consumer satisfaction. 2004; 23:234-246.

Murdoch L, Straker J, Brothers-McPhail D. Training Case Managers to Administer the Service Adequacy & Satisfaction Instrument (SASI): Case Manager Manual. August 2005. Scripps Gerontology Center, Miami University, Oxford, OH. Available at: http://sc.lib.muohio.edu/bitstream/handle/2374.MIA/299/RevisedCaseManagerManualSASI.pdf?sequence=1

Shettle, C., & Mooney, G. 1999. Monetary incentives in U.S. government surveys. Journal of Official Statistics.1999; 15, 231-250

Wiener JM, Anderson WL, Khatutsky G. Are consumer-directed home care beneficiaries satisfied? Evidence from Washington State. The Gerontologist 2007;47(6):763-774.
