Information Collection Request
Formative Studies on NIOSH Communications and Publications:
NIOSH Customer Satisfaction Survey
Supporting Statement Section B
Reinstatement with Change
0920-0544
Submitted by:
Juliann C. Scholl, Ph.D.
Education and Information Division
National Institute for Occupational Safety and Health
1090 Tusculum Avenue
Mailstop C-10
Cincinnati, Ohio 45226
Phone (513) 533-8178
Fax (513) 533-8560
e-mail xhn3@cdc.gov
November 1, 2017
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Tests of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
References
List of Tables
Table B.1-1. Stratified Sampling of the 7 Partnering Organizations
Table B.1-2. Respondent Universe and Sample by Association
Table B.3-1. Expected Response Rate by Organizational Stratum
B. Collection of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
Description of Universe: The target audience for this survey consists of occupational safety and health (OSH) professionals who serve either as (a) intermediaries who work on behalf of an organization to distribute OSH information and educational materials to other organizations or (b) employers or professional employees whose job includes (or partially includes) managing some aspect of workplace safety and health (referred to as "employers" for simplicity). The sampling frame consists of OSH professionals who belong to one or more of five groups (or strata) of partnering organizations: AIHA (8,000 members), ACOEM (4,000 members), AAOHN (4,200 members), and ASSE (36,000 members), plus a fifth group that combines three other associations: the American Insurance Association (AIA, 300 members), the Insurance Loss Control Association (ILCA, 100 members), and the National Fire Protection Association (NFPA, 60,000 members). Each organization will make its membership directory available to the research firm under contract with NIOSH. For sample selection, each of the five groups will be treated as a stratum (see Table B.1-1). Because an individual can belong to more than one professional group or stratum, a de-duplication operation will be performed at the frame level: individuals appearing on more than one membership list will be assigned to the group most aligned with their professional training, based on the title listed in their membership record. NIOSH employees who are members of any of the partnering organizations will also be excluded from the membership listings prior to sample selection.
Table B.1-1. Stratified Sampling of the 7 Partnering Organizations

Stratum     | AIHA | ACOEM | AAOHN | ASSE | Other (AIA, ILCA, NFPA)
Sample size | 500  | 500   | 500   | 500  | 500
Sampling Methods: Because the opinions of each profession about the topics in the survey – namely, sources for accessing NIOSH information; the use of NIOSH information in educational/training settings and in program and policy development; the frequency of use of NIOSH publications; ratings of the NIOSH information delivery system and the content of NIOSH publications; and NIOSH outreach initiatives – are of equal importance to NIOSH, the designated total sample size of 2,500 will be composed of an equal number of persons (500) drawn at random from each of the strata. Equal sample sizes were chosen because comparisons and estimates for subgroups were judged to be very important, and equal allocation optimizes this objective (i.e., provides the greatest power for subgroup comparisons). Using an equal allocation to strata will affect the precision of our estimates of proportions for the five groups combined, because the stratum sampling fractions are unequal. Although it would be cost-effective and efficient to send an email solicitation to all individuals listed as members of the partnering organizations, including all members would substantially increase respondent burden hours and costs; therefore, stratified random sampling will be used.
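For illustration, the frame construction and equal-allocation selection described above can be sketched as follows. The rosters, function names, and the rule of keeping a duplicate in the first roster encountered are illustrative assumptions standing in for the study's "most aligned with professional training" assignment; they are not part of the study protocol.

```python
import random

def build_frame(rosters, niosh_staff):
    """De-duplicate individuals across rosters and drop NIOSH employees.
    A member appearing on several lists is kept only in the first stratum
    encountered (a stand-in for the assignment rule described above)."""
    seen, frame = set(), {}
    for stratum, members in rosters.items():
        kept = []
        for m in members:
            if m in niosh_staff or m in seen:
                continue
            seen.add(m)
            kept.append(m)
        frame[stratum] = kept
    return frame

def equal_allocation_sample(frame, per_stratum=500, seed=1):
    """Draw a simple random sample of the same size from every stratum."""
    rng = random.Random(seed)
    return {s: rng.sample(members, per_stratum)
            for s, members in frame.items()}
```

With 500 drawn per stratum, the de-duplicated frame guarantees that no individual can be selected twice across strata.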
Sample Size Justification: Because the analysis of the questionnaire results will consist of descriptive statistics, the sample size was designed to produce summary measures estimated with a specified level of precision. Most of the summary statistics will be proportions (e.g., the proportion of respondents who have taken a course or attended an educational program in which NIOSH publications were used). In the results of the 2010 Customer Satisfaction Survey (CSS) – NIOSH Publications and Information Services, the majority of the percentages associated with the key findings ranged from 70 to 90 percent. Accordingly, the proposed total sample size of 2,500 individuals is expected to produce 95% confidence intervals with a half-width of approximately ±1.77% for a proportion of interest close to 0.70.
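As a check on the stated precision, the half-width of the 95% interval can be computed directly. The sketch below is one set of assumptions under which the ±1.77% figure is reproduced: the full sample of n = 2,500 and a finite-population correction against the combined universe of 84,600; the function name and the optional VIF argument are illustrative.

```python
import math

def ci_half_width(p, n, N=None, vif=1.0):
    """95% normal-approximation margin of error for a proportion,
    optionally with a finite-population correction and a variance
    inflation factor for disproportionate stratification."""
    se = math.sqrt(vif * p * (1 - p) / n)
    if N is not None:
        se *= math.sqrt((N - n) / (N - 1))  # finite-population correction
    return 1.96 * se

margin = ci_half_width(p=0.70, n=2500, N=84600)
print(round(margin, 4))  # 0.0177, i.e. about ±1.77 percentage points
```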
Table B.1-2. Respondent Universe and Sample by Association

                                     | AIHA  | ASSE* | ACOEM | AAOHN | Other† | Total
Population size                      | 8,000 | 8,000 | 4,000 | 4,200 | 60,400 | 84,600
Number of recruiting emails          | 500   | 500   | 500   | 500   | 500    | 2,500
Expected number of surveys completed | 300   | 300   | 300   | 300   | 300    | 1,500
Expected response rate (percent)     | 60    | 60    | 60    | 60    | 60     | 60
Actual response rate in 2010 survey  | 48    | 40    | 34    | 41    | N/A    | 41

* The ASSE sampling frame for this stratum comprises the segment of the membership certified as Certified Safety Professionals (CSPs).
† Members of the American Insurance Association (AIA), Insurance Loss Control Association (ILCA), and National Fire Protection Association (NFPA).
In our computations we have accounted for the fact that the differential sampling fractions in the strata will result in an increase in the sampling variance relative to proportional allocation to strata. The actual width of a confidence interval for a given sample size depends on the magnitude of the proportion; as the proportion approaches 0.50, the margin of error increases. In computing the sample size, we have assumed that the estimated proportion is approximately normally distributed. Accordingly, the 95% confidence interval for a proportion p will be of the form:

p ± 1.96 × √( VIF × p(1 − p) / n )

where
p = the estimated proportion
n = the number of completed questionnaires on which the proportion is based
VIF = the variance inflation factor due to the use of disproportionate stratified sampling
Given the population sizes of the five sample strata (which account for all 7 partnering organizations) and approximately equal strata variances, we estimate that the equal allocation will result in a value of 1.07 for the variance inflation factor.
Because the selection probability for an individual will depend on his or her stratum, survey weights must be developed for estimating proportions for the overall population. The base weight for a respondent will be the reciprocal of his or her probability of selection. In addition, the survey weight will include a nonresponse adjustment factor, computed with the objective of reducing bias in the estimates due to nonresponse.
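The weighting steps described above can be sketched as follows; the stratum counts and function names are illustrative, and the adjustment shown is the weighting-class method discussed further in Section B.3.

```python
def base_weight(N_h, n_h):
    """Base weight: reciprocal of the selection probability in stratum h
    (population size divided by sample size)."""
    return N_h / n_h

def nonresponse_factor(sampled_weights, respondent_weights):
    """Weighting-class adjustment: sum of the base weights of all sampled
    cases in the class divided by the sum for respondents only."""
    return sum(sampled_weights) / sum(respondent_weights)

# Illustrative stratum: 8,000 members, 500 sampled, 300 responding.
w = base_weight(8000, 500)                      # 16.0
adj = nonresponse_factor([w] * 500, [w] * 300)  # 500/300, about 1.67
final_weight = w * adj                          # about 26.67
```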
In the event that statistical comparisons are desired among subgroups of respondents, contingency table analyses will be employed. Chi-square tests will be used to determine whether the distributions of responses differ significantly between or among subgroups. When the subgroups are defined in terms of many characteristics, logistic regression models could be used to explore differences between professional subgroups.
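A minimal sketch of the contingency-table comparison follows, using a hand-rolled Pearson chi-square statistic on a hypothetical 2×2 table of subgroup responses (the table values are invented for illustration; in practice a statistical package would also supply the p-value).

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical: did respondents in two subgroups use a NIOSH publication?
table = [[30, 10],   # subgroup A: yes / no
         [20, 40]]   # subgroup B: yes / no
print(round(chi_square_statistic(table), 3))  # 16.667
```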
2. Procedures for the Collection of Information
Statistical method for stratification and sample selection: The project involves sending a recruiting email informing recipients of the study and inviting them to complete an online survey. The recruiting email will be sent to members randomly selected from their respective organizations, each of which falls into one of the five sample strata. As noted in Section B.1, based on the results of the 2010 NIOSH Customer Satisfaction Survey, the majority of the percentages associated with the key findings ranged from 70 to 90 percent. Thus, the proposed sample size of 2,500 individuals, yielding an expected 1,500 completed questionnaires, is designed to produce 95% confidence intervals with a half-width of approximately ±1.77% for a proportion of interest close to 0.70. In our computations we have accounted for the fact that the differential sampling fractions in the strata will result in an increase in the sampling variance relative to proportional allocation to strata, as detailed in Section B.1.
To sample organizational members, we will generate a random number for each record on a membership roster, sort the roster by the assigned random numbers, and select the first 500 records in each stratum. If a recruiting email is returned undeliverable, that record will be removed from the list so that no follow-up emails are sent to that address.
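The sort-by-random-number selection and the removal of undeliverable addresses can be sketched as follows; the roster contents, the seed, and the function names are illustrative.

```python
import random

def select_sample(roster, n, seed=2017):
    """Assign each record a random number, sort by it, take the first n."""
    rng = random.Random(seed)
    keyed = [(rng.random(), record) for record in roster]
    keyed.sort()
    return [record for _, record in keyed[:n]]

def drop_undeliverable(sample, bounced):
    """Remove bounced addresses so no follow-up emails are sent to them."""
    bounced = set(bounced)
    return [r for r in sample if r not in bounced]
```

Sorting on an independent uniform random key yields a simple random sample of the first n records, which is the selection mechanic described above.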
Data collection procedures: The research firm under contract with NIOSH will conduct all data collection activities for this project. Respondents will be recruited through an email and directed to a passcode-protected website to complete the survey. The contractor will collect the names and email addresses of the sampled members of the 7 partnering organizations and send each sampled member a recruiting email (Attachment 4a) containing information about the survey and an invitation to participate in the study. The recruiting email will provide a randomly generated passcode with which the respondent will log in to the online survey, along with a hyperlink to the survey. When respondents click on the hyperlink, they will see information about the survey (e.g., the reason for the survey, the number of questions, and the length of time needed to complete it), their rights as participants, and a question asking whether they would like to participate. Respondents then will be prompted to click one of two buttons on the screen: "YES, I'd like to complete the survey" or "NO, I don't want to complete the survey." Respondents who click "YES" will be directed to another page where they will enter the passcode they received in the recruiting email. Respondents who click "NO" will be directed to a page with the message: "That's okay. But before you go, we would really appreciate it if you could just answer 5 questions. Can you help us out?" Another prompt will ask them to click "YES" or "NO." Respondents who click "YES" will be directed to the webpage with the short-version survey and prompted to enter their passcode. Respondents who click "NO" will be directed to a final screen debriefing them about the survey and thanking them for their time. The full survey is estimated to take an average of 20 minutes to complete; the short version is estimated to take less than 5 minutes.
Navigation features will be built into the survey to make it easier to use. Respondents will see one question at a time to reduce the amount of information displayed on the screen. When respondents complete an item, they will hit a "SUBMIT" button that brings up a new screen with the next question. At the bottom of each screen will be "BACK," "NEXT," and "EXIT THE SURVEY" buttons. When respondents click the "EXIT" button, they will see a screen with the message: "Thank you for your time, but before you go, could you just answer _ more questions?" The questions they will be asked are those from the short-version survey that they have not yet completed (ranging from 1 to 5 questions). Again, they will be given the option to click "YES" or "NO," after which they either will be directed to the remaining questions or to a screen that gives them debriefing information and thanks them for their time.
A data management system developed by the contractor will be used to track respondents who complete or decline to complete the survey. Individuals who respond to the recruiting email, whether or not they ultimately complete the survey, will not receive any follow-up emails. Individuals who do not respond to the recruiting email (i.e., do not act on it by clicking the hyperlink contained in the message) will receive a first follow-up email (Attachment 4b) with a friendly reminder about the study. They also will be invited to participate in the study and given the same instructions for logging onto the survey as in the recruiting email. Individuals who respond to this first follow-up email and complete or decline to complete the survey will be tracked. Those who do not respond to the first follow-up email (i.e., take no action) will be sent one more follow-up email message (Attachment 4c) asking for their participation. These individuals will be tracked for their participation, refusal to participate, or lack of response. This tracking information will be kept secure and will not be shared with respondents' respective organizations or anyone else outside the scope of the study. All individuals included in this study will be tracked with the randomly generated passcode given to them in their recruiting email. This passcode will not be linked to a specific survey response. Throughout the data collection period, contractor staff will be available via the toll-free study phone number to assist any participants who need help or have concerns that need to be addressed.
Quality control procedures: Quality control procedures will be incorporated into the data processing system through the use of TeleForm®, a system that provides advanced optical character recognition (OCR) and throughput capabilities with accuracy rates approaching 100%. The contractor will develop robust forms that include range checks, skip-pattern checks, and cross-checks applied when scanning and verifying forms. In addition, Visual Basic scripts will be included in the web e-PDFs to incorporate quality assurance/quality control (QA/QC) features.
3. Methods to Maximize Response Rates and Deal with Nonresponse
Based on the results of the previous two surveys and on protocol improvements designed to improve response rates, it is expected that 1,500 of the 2,500 total sampled individuals (60%) will complete a survey within the data collection period. This anticipated number of completed surveys exceeds the number obtained in the 2010 Survey, for which the estimated return rate was also 60%. Table B.3-1 shows the expected response rate by organizational stratum. The expected response rate for the "Other" group is based on the average response rate of the other four partnering associations in the 2010 Survey.
Table B.3-1. Expected Response Rate by Organizational Stratum

                                    | AIHA | ASSE* | ACOEM | AAOHN | Other† | Total
Number of surveys administered      | 500  | 500   | 500   | 500   | 500    | 2,500
Expected number of surveys returned | 300  | 300   | 300   | 300   | 300    | 1,500
Expected response rate              | 60%  | 60%   | 60%   | 60%   | 60%    | 60%
Actual response rate in 2010 survey | 48%  | 40%   | 34%   | 41%   | N/A    | 41%

* The ASSE sampling frame for this stratum comprises the segment of the membership certified as Certified Safety Professionals (CSPs).
† Members of the American Insurance Association (AIA), Insurance Loss Control Association (ILCA), and National Fire Protection Association (NFPA).
Plans to maximize response rates: A variety of measures will be taken to ensure a high response rate for this survey:
Electronic letters from all partnering organizations, offering support for the project and encouraging participation, will be embedded in the recruiting emails sent to each potential participant. These letters will reinforce the importance of the survey.
The partnering organizations will be asked to announce and promote the survey in advance of the recruiting email through their respective newsletters and house publications. This will help increase awareness of the survey and should increase the number of sample members who recognize the mailing and take the time to respond.
Hyperlinks to PDF and mobile app downloads of the NIOSH Pocket Guide to Chemical Hazards will be included in the initial recruiting email as an incentive to all of those who participate. As noted earlier, research indicates that an incentive enclosed with the survey has a stronger impact on response rates than incentives provided upon completion of the survey [Dillman 2000]. Pilot tests of previous CSSs suggest that these incentives are appropriate and desirable for the target participants.
An attractive, well-designed, and user-friendly questionnaire addressing topics of importance to the target sample will facilitate response. The online survey and the website that hosts it will be formatted for ease of response by all participants.
In addition, several questions and response categories have been revised so that they are perceived as relevant to a wider range of OSH professionals, beyond those included in the sampling frames of the 2003 and 2010 surveys.
Individuals will be given the option of responding to the short-version survey, making it easier for those who do not want to complete the full-version survey to participate in the study. Multi-mode surveys with an Internet option have been shown to yield higher response rates than surveys using only a single response option [McMahon et al. 2003].
Follow-up emails will be sent to enhance response rates. It is well established that higher survey response rates are achieved when repeated contacts, in the form of follow-up letters and appeals, are employed [Dillman 2000]. Therefore, in addition to the announcements made by the partnering organizations to their members and the recruiting email introducing the survey, we will send two follow-up emails at one-month intervals.
Plan to evaluate potential nonresponse bias: Two approaches to the computation of nonresponse adjustment factors will be considered and both approaches assume that there are characteristics for both respondents and nonrespondents that are associated with the propensity to respond and the key survey statistics. If there are only a few potential characteristics on the sampling frame for nonresponse adjustment, the weighting class method may be used. A cross classification of the characteristics would be constructed and in each cell of the cross classification, a nonresponse adjustment factor would be computed. The nonresponse adjustment factor would be the sum of the base weights of all sampled cases (respondents plus nonrespondents) in the cell divided by the sum of the base weights of all respondents in the cell. If there are several potential characteristics on the sampling frame that could be used for nonresponse adjustment, consideration could be given to building a response propensity model using logistic regression. Here the dependent variable is whether or not the sampled individual completed a questionnaire. After estimating the parameters of the logistic regression model, the predicted likelihood of response could be computed for each individual. The inverse of the predicted likelihood of response would then be applied to the base weight.
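The response propensity approach described above can be sketched as follows. This is a minimal illustration with a single covariate fitted by gradient descent; the function names, learning rate, and data are illustrative assumptions, not the contractor's actual estimation procedure.

```python
import math

def fit_logistic(xs, ys, lr=0.1, iters=5000):
    """One-covariate logistic response propensity model fitted by
    gradient descent: P(respond) = 1 / (1 + exp(-(a + b*x)))."""
    a = b = 0.0
    n = len(xs)
    for _ in range(iters):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += (p - y) / n
            gb += (p - y) * x / n
        a -= lr * ga
        b -= lr * gb
    return a, b

def propensity_weights(base_weights, xs, a, b):
    """Divide each base weight by the predicted response propensity."""
    return [w / (1.0 / (1.0 + math.exp(-(a + b * x))))
            for w, x in zip(base_weights, xs)]
```

Sampled individuals with a lower predicted propensity to respond receive a larger adjusted weight, which is the intended bias-reduction mechanism.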
4. Tests of Procedures or Methods to be Undertaken
The majority of the data collection procedures and survey questions were drawn from the 2010 baseline survey (OMB No. 0920-0544), as detailed in Attachment 7b.
In developing the Customer Satisfaction and Impact (CSI) Survey, the 2010 Survey was revised (as previously described) and reviewed by leaders of 12 Divisions, Offices, and Programs. Their feedback was synthesized and incorporated into the final revision. Dr. Bolanle Olaniran, Professor of Communication Studies at Texas Tech University (Texas Tech University, Department of Communication Studies, Box 43082, Lubbock, TX 79409; email: b.olaniran@ttu.edu; Tel: (806) 834-3978), provided input on the construction of the survey, the design of the study, and the statistical procedures for analyzing the results. Dr. Olaniran has extensive experience in researching the use of information technologies and has expertise in quantitative survey design.
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Dr. Juliann Scholl, Health Communication Fellow in the Education and Information Division (EID), NIOSH (4676b Columbia Parkway, Cincinnati, OH 45226; phone (513) 533-8178), oversaw all aspects of project design and is responsible for receiving and approving contract deliverables. Dr. Paul Schulte (Director of EID), Dr. Donna Van Bogaert, Dr. Vern Anderson, Dr. TJ Lentz, and Dr. Tom Cunningham, all of EID, are senior staff who were consulted for their expertise in data collection methods and survey design. Dr. Donald Eggerth, Mr. Michael Flynn, and Ms. Rebecca Guerin, of the Training Research and Evaluation Branch within EID, are directors of various programs and were consulted for their survey design and data collection expertise. Qualified staff from the research firm under contract will serve as the primary liaison with the partnering organizations, randomly select the sample, collect and analyze the data, track respondents, and prepare reports that summarize study results.
As explained previously, the CSI Survey was developed from the 2010 Survey, a project led by Dr. Vern Putz Anderson, Research Psychologist from the Education and Information Division, NIOSH (4676 Columbia Parkway, Cincinnati, OH 45226; Phone (513) 533-8319), Technical Monitor, who oversaw all aspects of project design. Mr. Charles Wolters, a statistician from Battelle Centers for Public Health Research and Evaluation (6115 Falls Rd, Ste 200, Baltimore, MD 21209; Tel (410) 372-2732), designed the sampling plan and analyzed the results of the survey. Ms. Marianne Story-Yencken (Battelle Seattle office, 1100 Dexter Ave N, Suite 400, Seattle, WA 98109; Tel (206) 528-3164), Certified Industrial Hygienist, was the primary liaison with the four partner associations. Dr. Lisa John, Project Director at Battelle Centers for Public Health Research and Evaluation (10420 Old Olive Street Road, Ste 300, St. Louis, MO 63141; Tel (314) 993-5234), oversaw the random selection of the sample of respondents, the printing and mailing of the forms, and the collection and processing of the returns. Dr. John and Mr. Wolters, both of Battelle, analyzed the data and prepared reports summarizing the results of the survey.
References
Dillman D [2000]. Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley & Sons.
McMahon S, Iwamoto M, Massoudi M, Yusuf H, Stevenson J, et al. [2003]. Comparison of e-mail, fax, and postal surveys of pediatricians. Pediatrics 111:299–303.