OMB No.: 0920-0735
Division of eHealth Marketing (DeHM)
National Center for Health Marketing (NCHM)
Centers for Disease Control and Prevention
CDC Project Manager
Carol Crawford, Deputy Director
Division of eHealth Marketing (DeHM)
National Center for Health Marketing (NCHM)
Centers for Disease Control and Prevention (CDC)
404-498-2480
February 10, 2010
INTRODUCTION
This revision of 0920-0735, CDC Website Usability Evaluation (expiration date March 31, 2010), requests 3 more years of approval for usability surveys on the Centers for Disease Control and Prevention (CDC) Web site, as well as clearance for usability surveys on social media, mobile-based, or other electronic communication channels hosting CDC content. This amended package includes minor revisions and additions, along with a bank (library) of generic questions to be used.
With the previous Usability Evaluation package, various groups around the agency were able to conduct useful surveys assessing the usability of a variety of CDC Web sites. The CDC.gov Homepage and other CDC Web sites were redesigned based on the usability surveys conducted, and the resulting designs improved performance for Web site users. The next step is to continue usability surveys on more Web sites, staying abreast of changes in target audience needs and online behavior, as well as to survey users of CDC social media, mobile-based, or other electronic communication channels hosting CDC content that currently exist or will emerge during the life of this package.
Usability surveys determine how well CDC’s Web site, social media, mobile-based, or other electronic communication channels hosting CDC content are performing. Observation and data collection on how users interact with the Web site or other electronic communication channels hosting CDC content are critical to ensuring that users can find information and that the Web site or other electronic communication channels are easy to use and designed to meet the needs of specific audiences.
This package requests clearance for two types of surveys: remote and in person. Remote surveys will collect data about how participants interact with CDC’s Web site, social media, mobile-based, or other electronic communication channels hosting CDC content; users will take the survey at their home or work computers. In person surveys will have participants take the survey in a central location where their data can be captured electronically, as with the remote survey, but where participants can also be directly observed. The direct observation afforded by in person surveys allows for enhanced collection of information, such as observation of facial expressions and listening to verbal feedback.
The question bank is needed because every survey will be based on specific health issues or topics, or target audiences; it is not possible to develop one instrument for use in all instances. This package provides a list of generic tasks and questions that can be used to develop a survey for a specific CDC Web site, social media, mobile-based, or other electronic communication channel hosting CDC content. Screening questions (comprised of demographic, introductory, or core questions) are also included in the package, and a subset of these screening questions will be used to create the proper sample for each usability survey. Participants in a usability survey are reflective of the target audience for a CDC Web site, social media, mobile-based, or other electronic communication channel hosting CDC content.
A. JUSTIFICATION
A-1. Circumstances Making the Collection of Information Necessary
Background
Executive Order 12862 (Appendix 1) directs Federal agencies that provide significant services directly to the public to survey customers to determine the kind and quality of services they need and their level of satisfaction with existing services. The Centers for Disease Control and Prevention (CDC) seeks to obtain approval to conduct usability surveys on CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content on an ongoing basis.
It is important for CDC to ensure that health information, interventions, and programs at CDC are based on sound science, objectivity, and continuous customer input. The CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content must be designed to be easy to use, easy to access, and effective providers of health information and resources to our target audiences.
CDC is requesting renewal of our existing 3-year generic clearance, with revisions, to carry out its mission. Generic clearance is needed to ensure that CDC can continuously improve its Web sites, social media, mobile-based or other electronic communication channels hosting CDC content through regular surveys developed from these pre-defined questions.
Surveying the CDC Web site, social media, mobile-based or other electronic communication channels hosting CDC content on a regular, ongoing basis will help ensure that users have an effective, efficient, and satisfying experience on any of our Web sites or communication channels, maximizing the health impact of the information and resulting in optimum benefit for public health. The surveys will ensure that all CDC Web sites and electronic communication channels meet customer and partner priorities, build CDC’s brand, and contribute to CDC health impact goals.
This survey is authorized under the Public Health Service Act (42 USC 241) Section 301. A copy of the legislation is included (Appendix 2).
Privacy Impact Assessment
Overview of the Data Collection System
CDC employees, fellows, full-time contractors, or contract vendors will collect the data for these surveys. The data collected will include some or all of the following: background participant information, time and routes taken through the Web site or other electronic communication channels to complete certain online tasks, experience using the electronic communication channel, and overall satisfaction measures.
As discussed further in Section A-3, data will be collected electronically, whenever possible, to reduce the burden to the respondent. Typically, respondents will click on a “radio button” or “checkbox” that corresponds to their response. For open-ended questions in usability surveys, the respondent would typically be told to enter their answer in the provided text box. Remote surveys allow CDC to collect data from a variety of audiences across the nation, and in person surveys allow for more direct observation of users on the CDC Web site or other electronic communication channels. In some cases, observation of the surveys or viewing of video or audio tapes from the surveys is critical to getting changes implemented and to resolving differences of opinion among staff, senior management, and other involved parties. Use of audio and video tapes is standard protocol among usability professionals.
Items of Information to be Collected
No names or other information that could identify the respondent will be recorded. A code number will be assigned to an individual’s responses. For remote surveys and clickstream technology, it is not possible for CDC to link this information to the IP (Internet Protocol) addresses of participants, and the survey software will not do so. Thus, there is no way to identify respondents.
For remote surveys, where CDC has e-mailed, phoned or written to request participation, the names collected for the purposes of contacting potential participants will not be recorded or kept with the survey responses. We will only know that we asked the participant to take the survey, not whether they did, and we will not be able to match results with participants.
For in person surveys, some CDC staff will have the name of the participant in order to grant them access to CDC facilities; however, this information will not be recorded with survey results. Only a code number will be utilized.
Identification of Website(s) and Website Content Directed at Children Under 13 Years of Age
The information collected may involve web-based or other electronic data collection methods that may refer respondents to Web sites, social media, mobile-based or other electronic communication channels hosting CDC content. It is not likely that these communication channels will host content directed at persons less than 13 years of age. Typically, only communication channels hosting CDC content will be visited by respondents during the survey. For example, the respondent may be asked to visit the CDC Web site, which uses no persistent cookies and contains a privacy policy and rules of conduct, in accordance with Federal law.
Respondents may visit non-CDC Web sites during the survey. For example, content syndication has proven to be a useful mechanism for CDC to share its scientific content with state and local health departments, partners, and others, who may place it on their Web sites or utilize it in their other electronic communication channels. Such Web sites may have content for persons of all ages, and they may use cookies. However, it is necessary to assess the usability of CDC content on non-CDC Web sites, as we want to ensure that all users of CDC content have an efficient, effective, and enjoyable experience.
A-2. Purpose and Use of the Information Collection
The current Usability Evaluation package expires March 31, 2010. CDC is resubmitting its generic clearance of the Usability Evaluation package. The submission of this amended package includes minor revisions and additions, along with a bank (library) of generic questions to be used. The question bank is needed because every survey will be based on specific health issues or topics, or target audiences; it is not possible to develop one instrument for use in all instances.
With the previous Usability Evaluation package, various groups around the agency were able to conduct useful surveys assessing the usability of a variety of CDC Web sites. These surveys covered important CDC programs and topics, such as Seasonal Flu, Tuberculosis, HIV, STDs, and Chronic Diseases. The CDC.gov Homepage and other CDC Web sites were redesigned based on usability surveys conducted within this package, and the resulting designs improved performance for Web site users and won numerous awards, both within and outside of the Federal government space. For example:
CDC.gov on Nextgov Top 5 Picks - Best Practices for Government Web Sites
On Monday, February 2, 2009, Nextgov (Technology and the Business of Government) published a story, “Best Practices for Government Web Sites,” by Melanie Bender. CDC.gov was chosen as one of five federal Web sites that employ the best online practices and pay careful attention to what users want to see and do online. The Top 5 federal Web sites include: NASA, the Library of Congress, the Centers for Disease Control and Prevention, the Social Security Administration, and the Transportation Security Administration.
CDC.gov included in Best Health News Online
As reported by Examiner.com National’s Best Health (9.30.08), CDC.gov was identified as one of the top sites for staying abreast of the most up-to-date health news and scientific innovations. The top sites include: eMedicineHealth.com/WebMD.com, Healthscout.com, Mayoclinic.com, Revolutionhealth.com, NIH.gov, CDC.gov, Healthfinder.gov, and Sciencedaily.com. Examiner.com article excerpt: “CDC.gov is the CDC's primary online communication, with about 41 million page views per month. Whether it's for personal use, a school report or data aggregations, this site provides the most current data and statistics on diseases and conditions, emergencies and disasters, environmental health and healthy living – along with news on research and government health policies. Stay informed on the latest in vaccinations, disease prevention and burgeoning treatments.”
CDC.gov Receives 2008 Government Web Manager's Web Best Practice Awards
CDC.gov and the VA Midsouth Healthcare Network shared the top prize in the third annual competition sponsored by the Federal Web Managers Forum, a consortium of more than 1,600 federal, state, and local government Web staff. The award is presented by Web management peers; judges selected six finalists based on the following criteria: identifying the top customer task, making such tasks easier to perform, measuring success, and improving the ease of performing critical tasks. The entire membership of the Federal Web Managers Forum then had the opportunity to vote for award winners and select the recipients of the top prize.
CDC.gov also received many more awards, both within and outside of the Federal government space.
The next step is to continue usability surveys on more Web sites, staying abreast of changes in target audience needs and online behavior as well as survey users of CDC social media, mobile-based or other electronic communication channels hosting CDC content that currently exist or will emerge during the life of this package. CDC is currently using mobile web sites, text messaging, online quizzes, widgets, podcasts, eCards, online video, motion graphics, blogs, syndicated content, and other communication channels and will continue to explore other channels which provide CDC content to target audiences when, where, and how they want and need it. As new channels emerge, CDC will explore using them to deliver its content.
New communication channels have emerged and will continue to emerge after an umbrella OMB package is approved. According to the Pew Internet & American Life Project, “in 2000, 46% of American adults had access to the internet, 5% of U.S. households had broadband connections, and 25% of American adults looked online for health information. Now [2008], 74% of American adults go online, 57% of American households have broadband connections, and 61% of adults look online for health information.” This increase in access to the Internet coincides with an increase in social media usage. For example, blogosphere growth remains strong, with over 120,000 new blogs being created every day (as reported by Sifry). Access to mobile Web sites in the U.S. has grown three-fold in the past year, according to Bango.
Other social media channels, including social networks such as MySpace and Facebook, also show steady growth. As new communication channels and technologies emerge, it becomes important for government agencies to be accessible to their constituents and to provide necessary information in arenas beyond their Web sites. Additionally, in many cases these new technologies can reduce the burden on the participant; for example, a participant accessing the CDC Web site on a mobile phone can answer survey questions on that same phone. There should be flexibility to use emerging communication channels and ask related questions if they do not increase the burden on the participant. In some instances, the wording of a question from the bank in the appendices may need to be shortened to fit within the constraints of a particular communication channel. In shortening a question, the meaning will be preserved and the burden on the participant will not be increased.
The entire CDC Web site gets over 150 million hits a month and contains over 200,000 pages of information, products, guidelines, and training focused on diseases, health conditions, and public health. The CDC Web site is comprised of multiple smaller sites, and most usability surveys will focus on evaluating just one portion of www.cdc.gov or one CDC communication channel. The CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content are accessible to everyone on the World Wide Web and have many different audiences, including public health professionals, physicians, media, policy makers, and the general public.
By collecting usability information on CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content, CDC will be able to serve and respond to the ever-changing demands of its Web site and other communication channel users. Additionally, we will be able to determine the best way to present messages to target audiences.
Target audiences include individuals (such as patients, educators, students, etc.), interested communities, partners, healthcare providers, and businesses. Survey information will be used to improve the content and delivery of the CDC Web site and other electronic communication channels and to design surveys that help us understand the user, and more specifically, the CDC user community.
Privacy Impact Assessment
The purpose of such usability surveys is to judge the content and presentation of the CDC Web sites, social media, mobile-based or other electronic communication channels through which CDC communicates scientific health information to its target audiences, to help ensure that health impact is maximized through the delivery of useful, efficient, and effective communication channels.
Primary objectives are to determine whether CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content:
(1) Meet the wants, preferences, and needs of its target audiences.
(2) Are an effective vehicle for sending messages to target audiences.
(3) Provide users with the kind and quality of services they need.
(4) Deliver existing services at a satisfying level of quality.
Findings will help to:
(1) Understand the user community and how to better serve CDC users.
(2) Identify areas requiring improvement in either content or delivery.
(3) Determine how to align communication channels with identified user need(s).
(4) Determine the kind and quality of services our target audiences need.
(5) Explore new or refined methods for offering, presenting and delivering information most effectively, to enable us to present messages as well as serve the needs of people who are already seeking particular information or want to learn about a particular topic.
The data collected from this effort will allow us to answer critical usability questions, including:
What are the needs and preferences for our target audiences?
How often and for what purposes (there can be several simultaneously) do our target audiences typically use the CDC Web site or other communication channels?
How satisfied are they with their experience on the CDC Web site or other communication channels?
What difficulties do they experience when trying to complete typical tasks on the CDC Web site or other communication channels?
In what ways can we improve their speed and ability to find the information they want, expect or need on the CDC Web site or other communication channels?
Are messages on the site presented in such a way that they are noticeable, easy to understand, easy to remember, and have an impact on the viewer’s behavior plans?
How does their awareness of, knowledge of, and opinions on a health topic change after viewing the CDC Web site or other communication channels?
Did they find information/messages about health issues they weren’t initially looking for when viewing information on the site? Did the message have an effect, e.g. change their behavior plans?
Are they satisfied with the services offered through the CDC Web site or other communication channels?
What improvements would the user like to see made to the existing services on the CDC Web site or other communication channels?
What other services do they need?
What existing and emerging communication channels provide the best user experience for accessing and disseminating CDC content?
The survey will help ensure that the CDC Web site and other electronic communication channels meet user and agency needs, build CDC’s brand, and contribute to health impact goals. Feedback from the user base is necessary to fully judge the performance of CDC’s Web site and communication channels.
No sensitive information and no information in identifiable form (IIF) will be collected. The proposed data collection will have little or no effect on the respondent’s privacy. No names or other information that could identify the respondent will be recorded; a code number will be assigned to an individual’s responses, as discussed further in Section A-10.
All data collected through the survey will be used to determine whether CDC should revise content, labels, structure, or layout of its Web sites, social media, mobile-based or other electronic communication channels hosting CDC content. If indicated, revisions would be intended to increase the success rate of information-seeking users.
A-3. Use of Improved Information Technology and Burden Reduction
Data will be collected electronically, whenever possible, to reduce the burden to the respondent.
Typically, respondents will click on a “radio button” or “checkbox” that corresponds to their response. For open-ended questions in usability surveys, the respondent would typically be told to enter their answer in the provided text box. We have attempted to keep the format of the survey simple, with short questions and clearly labeled and scaled answer choice-sets. There may also be specific tasks where we ask the respondent to find an answer to a specific question (for example, Activity/Task Questions, Appendix 8). We will usually be able to determine how they found the answer by using clickstream technology, which will lessen the time required to complete the survey since respondents will not have to self-record their movements through CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content. Additionally, users will sometimes be able to copy and paste from the Web site or other electronic communication channels to reduce the time required to answer open-ended questions.
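As a purely illustrative sketch (not part of the approved instruments), the electronically captured record for a single respondent might be structured as shown below. The field names and values are hypothetical assumptions, not features of any specific survey software; the point is that responses, task answers, and clickstream data are keyed to a code number, with no name, e-mail address, or IP address retained, consistent with Sections A-1 and A-10.

```python
# Illustrative only: a hypothetical structure for one respondent's stored record.
# Field names are assumptions; the actual survey software may store data differently.
from datetime import datetime, timezone

response_record = {
    "code_number": "R-0042",            # assigned code; no name, e-mail, or IP address is stored
    "survey_id": "example-homepage-survey",  # hypothetical survey identifier
    "channel": "remote",                # "remote" or "in_person"
    "answers": {
        "q1_satisfaction": 4,           # radio-button response (e.g., 1-5 scale)
        "q2_found_info": True,          # checkbox response
        "q3_comments": "Navigation was clear.",  # open-ended text box
    },
    "clickstream": [                    # pages visited while completing an assigned task
        "/flu/",
        "/flu/symptoms/index.htm",
    ],
    "completed_at": datetime.now(timezone.utc).isoformat(),
}

# Analysis works only with the coded record, never with identifying information.
print(response_record["code_number"], len(response_record["clickstream"]), "pages visited")
```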
Each survey will contain questions from some or all of the following question sets: Consent Forms (Appendix 4), Demographic Questions (Appendix 5), Introductory Questions (Appendix 6), Core Questions (Appendix 7), Activity/Task Questions (Appendix 8), and Follow up Questions (Appendix 9). There are no “standard questions” which will be asked in every survey because surveys will vary in scope. In the interest of the participant’s time and reducing burden to the participant, each survey will ask only those questions which are absolutely necessary to improve the specific program’s Web site, social media, mobile-based or other electronic communication channels hosting CDC content. Studies will allow for cross-topic, cross-Web page, or cross-electronic channel comparison of results, which will provide insight into how to manage the portfolio of pages and topics within CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content.
The set of survey questions included in this package was gathered from (1) previous usability surveys conducted at CDC and (2) usability questions recommended and used by usability professionals in other organizations; these questions are considered best practices. In determining which questions to include in the package, usability professionals across CDC were consulted, and questions that had performed poorly in the past or were not considered best practices were discarded. Because we are requesting a 3-year generic clearance for a wide variety of possible usability surveys on CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content, the list of questions is large enough that this package can cover all potential survey scenarios needed. However, each survey is limited to a reasonable length of time, and CDC staff will not incorporate every question into a single survey.
Appendices 5 – 9 include all survey questions: those previously approved with this package as well as new questions for which we are requesting approval with this renewal. Appendix 11 lists just the new questions.
The remote surveys allow CDC to collect data from a variety of audiences across the nation. In many cases, participant responses are captured electronically, as are participants’ clicks through the CDC Web site, social media, mobile-based or other electronic communication channels hosting CDC content. In person surveys allow for more direct observation of users on the CDC Web site or other electronic communication channels and are needed in some instances to gain even greater insight into users’ behaviors. During an in person survey, the responses are often tracked electronically, as they are for the remote surveys; however, in person surveys also allow CDC to observe where users place their mouse (before clicking), observe physical responses (shock, confused looks, etc.), and let participants explain why they are clicking on various parts of the Web site or other electronic communication channels as they do so. These responses can be recorded using video and audio tapes and analyzed along with the electronic responses. Observation of the surveys or viewing of video or audio tapes from the surveys is critical to getting changes implemented and to resolving differences of opinion among staff, senior management, and other involved parties. Use of audio and video tapes is standard protocol among usability professionals.
During initial surveys of CDC’s Web site, social media, mobile-based or other electronic communication channels hosting CDC content, it is important to perform in person surveys when possible to add this enhanced feedback. Remote surveys are typically performed as follow up surveys, on low-trafficked sites, or when an in person survey is not feasible due to limited time, budget, staff, or other resources.
A-4. Efforts to Identify Duplication and Use of Similar Information
Prior to approval of OMB No. 0920-0735, no similar information existed, although some usability surveys had been conducted on a few, specific pages of the CDC Web site and other communication channels. OMB No. 0920-0572 for health message testing has also occasionally been used to collect similar information. Approval of this package will greatly expand CDC’s ability to perform usability surveys on CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content.
A-5. Impact on Small Businesses or Other Small Entities
There is no burden on small businesses or small entities.
A-6. Consequences of Collecting the Information Less Frequently
There are a number of potential negative consequences if these data are not collected. If the collection is not conducted on an ongoing basis, we will not have the data needed to routinely revise messages and reorganize online health information in a way that is most easily understood and accessed by those who use the CDC Web site and other electronic communication channels. Specifically, without these data there would be:
No performance measures by which to determine effectiveness of a CDC Web site or other communication channels as a tool for our visitors and a message channel. This results in lowered user satisfaction, fewer return visits, and decreased information dissemination.
No user data to include in Web site or communication channel design decision-making to ensure that user experience on our site is efficient, effective, and enjoyable. This results in an unfocused approach to design in which we are unable to determine whether our Web site or communication channels are useful or not.
No two-way communication between CDC users and electronic communication channel coordinators. Two-way communication and user feedback are essential to the proper production and dissemination of health information and are widely used in the field of public health; we need to implement such a process for our Web-based products and all communication channels as well.
Vital feedback regarding customer and/or partner satisfaction with various aspects of the CDC’s communication channels will be unavailable.
Usability surveys will only be conducted at intervals considered appropriate to measure the impact of changes to CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content and to monitor the level of performance. In most cases each section of the CDC Web site or an electronic communication channel will likely conduct usability surveys annually or biannually after the establishment of a baseline. Collection on a less frequent basis than annually or biannually will likely reduce the practical utility of the information and inhibit CDC’s ability to monitor changes.
We are only expecting one-time responses from respondents. Therefore, it is not possible to ask participants to fill out the survey less frequently. There are no legal obstacles to reducing the burden.
A-7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
There are no special circumstances with this information collection package. This request fully complies with the guidelines of 5 CFR 1320.5.
A-8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A. A 60-day Federal Register Notice was published in the Federal Register on Thursday, December 17, 2009, Vol. 74, No. 241, pp. 66977-66978 (Attachment 3). No public comments were received.
B. Although no outside consultation was used, extensive review and input were received from Usability Specialists and Web coordinators across the Agency, including:
Glenn Doyle, Technical Information Specialist, CDC National Institute for Occupational Safety and Health, 513.533.8386, gdoyle@cdc.gov
Susan Leonard, Health Communication Specialist, CDC Division of Nutrition and Physical Activity, 770.488.5233, sleonard1@cdc.gov
Sharon McAleer, Webmaster, CDC National Center for Human Immunodeficiency Virus, Sexually Transmitted Diseases, and Tuberculosis Prevention, 404.639.5089, smcaleer@cdc.gov
Thomas Cona, Usability Analyst, CDC National Center for Chronic Disease Prevention and Health Promotion, 678.530.8884, tcona@cdc.gov
Susan Schuffenhauer, Usability Analyst, CDC National Center for Infectious Diseases, 404.639.2939, sschuffenhauer@cdc.gov
Lisa Richman, Web Lead - Health Communication Science Office (HCSO), National Center for HIV/AIDS, Viral Hepatitis, STD and TB Prevention (NCHHSTP), 404.639.8535, lrichman@cdc.gov.
Nick Sabadosh, User Experience Manager, CDC National Center for Health Marketing, 404.639.8051, nsabadosh@cdc.gov
Sarah Greer, User Experience Designer, CDC National Center for Chronic Disease Prevention and Health Promotion, 678.530.8948, sgreer1@cdc.gov
Catherine Jamal, User Experience Specialist and Assistant Team Lead, Emergency Web Team, Division of Emergency Operations, Office of Public Health Preparedness and Response, 404-639-4307, cjamal@cdc.gov
Carol Crawford, Deputy Director, Division of eHealth Marketing, CDC National Center for Health Marketing, 404-498-2480, ccrawford@cdc.gov
A-9. Explanation of Any Payments or Gift to Respondents
CDC will not directly provide remuneration to respondents. However, some respondents may receive remuneration through recruitment companies contracted to obtain participants. CDC may use these recruitment companies to find participants for larger surveys or when it is difficult to find specific types of audiences willing to participate, e.g., clinicians. It is typical for recruitment companies to provide remuneration to participants as part of their practices. The amount of remuneration is based on pay scales these companies follow. CDC will pay a fixed price to a recruitment company for its services and not specifically for any set remuneration. The recruitment company would have full names and addresses of participants, but this information would never be supplied to CDC or stored with any survey data or results.
A-10. Assurance of Confidentiality Provided to Respondents
Privacy Impact Assessment A
The Privacy Act does not apply to data collections conducted according to procedures described in this application. All questions for the surveys to be conducted under this OMB approval are included within this Information Collection Request.
Privacy Impact Assessment B
As discussed in Section A-1, no names or other information that could identify the respondent will be recorded. A code number will be assigned to an individual’s responses. For remote surveys and clickstream technology, it is not possible for CDC to link this information to the IP (Internet Protocol) addresses of participants, and the survey software will not do so. Thus, there is no way to identify respondents.
For remote surveys, where CDC has e-mailed, phoned or written to request participation, the names collected for the purposes of contacting potential participants will not be recorded or kept with the survey responses. We will only know that we asked the participant to take the survey, not whether they did, and we will not be able to match results with participants.
For in person surveys, some CDC staff will have the name of the participant in order to grant them access to CDC facilities; however, this information will not be recorded with survey results. Only a code number will be utilized.
Privacy Impact Assessment C
All participants will be informed at the beginning of the activity (prior to participation) that their responses will be treated in a secure manner, that all data will be safeguarded closely, and that no individual identifiers are planned to be used in survey reports. All participants will review a consent form (examples in Appendix 4).
Privacy Impact Assessment D
Respondents will be advised of the nature of the activity, the length of time it will require, and that participation is purely voluntary. Respondents will be assured that they will not incur penalties if they wish to not respond to the information collection as a whole or to any specific questions. These procedures conform to ethical practices for collecting data from human participants. All information provided by respondents will be treated in a secure manner, unless otherwise compelled by law.
This project is exempt from IRB requirements. Data collection activities permitted under 0920-0735 are not research and therefore do not require IRB review.
A-11. Justification for Sensitive Questions
Questions concerning Race and Ethnicity may be considered sensitive by a portion of respondents. Race and Ethnicity questions are included in the set of Demographic questions that may be asked of respondents. Where relevant to the usability evaluation of CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content, Race and Ethnicity data will be collected consistent with HHS policy and standard OMB classifications.
A-12. Estimates of Annualized Burden Hours and Costs
There will be two lengths of surveys conducted, depending on whether the survey is in person or remote. An in person survey will last an average of 60 minutes and take place at a CDC computer; a remote survey will last an average of 30 minutes and may take place at the participant’s computer. These estimates were determined through analysis of times from previous usability surveys using similar questions, a survey of usability professionals to ascertain average times for users to perform tasks, and a pilot survey of 10 internal users comprised of CDC staff and CDC contractors. Some remote surveys will take much less time. The majority of usability surveys conducted at CDC have been done remotely; thus, we estimate that in the future more surveys will probably be done remotely rather than in person.
The estimate of survey respondents was based on the ideal number of usability surveys that CDC would conduct over a 3-year period. Factored in were initial surveys and subsequent follow-up surveys utilizing a satisfactory number of participants. It is anticipated that most of CDC’s sites will require some sort of usability survey. Additionally, CDC anticipates conducting a number of important baseline surveys for its home page and other highly trafficked sub-sites in order to continually refresh these pages as part of CDC’s priority to more effectively utilize its Web sites, social media, mobile-based or other electronic communication channels hosting CDC content.
Estimated Annualized Burden Hours
| Survey Type | Number of Respondents | Frequency of Response per Respondent | Avg. Burden per Response (hrs.) | Burden Hours |
| In Person Surveys | 8,000 | 1 | 1 | 8,000 |
| Remote Surveys | 67,000 | 1 | 30/60 | 33,500 |
| Total | 75,000 | | | 41,500 |
An average hourly salary of approximately $18.09 is assumed for all respondents, including clinicians and scientific users, based on the Department of Labor (DOL) National Compensation Survey. Because of the scope of this generic clearance and the variety of the types of participants, the average salary was utilized rather than attempting to estimate salaries for groups of audiences. With a maximum annual respondent burden of 41,500 hours, the overall annual cost of respondents’ time for the proposed surveys is estimated to be a maximum of $750,735.00 (41,500 hrs x $18.09). There will be no direct costs to the respondents other than their time to participate in each survey.
Estimated Annualized Burden Cost
| Total Respondent Hours | Hourly Pay Rate | Total Respondent Burden Cost |
| 41,500 | $18.09 | $750,735.00 |
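As a worked restatement of the burden and cost figures above (no new assumptions; the values come directly from the two tables):

```latex
\begin{align*}
\text{In person burden} &= 8{,}000 \text{ respondents} \times 1 \text{ hr} = 8{,}000 \text{ hrs}\\
\text{Remote burden} &= 67{,}000 \text{ respondents} \times \tfrac{30}{60} \text{ hr} = 33{,}500 \text{ hrs}\\
\text{Total annual burden} &= 8{,}000 + 33{,}500 = 41{,}500 \text{ hrs}\\
\text{Annualized respondent cost} &= 41{,}500 \text{ hrs} \times \$18.09/\text{hr} = \$750{,}735
\end{align*}
```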
A-13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
There are no additional costs to the respondents. There is no burden to record keepers.
A-14. Annualized Cost to the Federal Government
Usability surveys will be prepared by contractors or CDC staff (FTEs). An FTE manager will review all surveys. Usability teams will vary across CDC, but typically an FTE and a contractor will work together on survey preparation, conducting the surveys, and analyzing data. Additionally, a senior-level FTE will typically review and approve the activities. The amount of time staff and contractors spend on surveys will vary depending on the number of participants for each survey, the number of questions, and the Web site, social media, mobile-based or other electronic communication channel being surveyed. An average of 200 surveys a year was assumed for estimation purposes. Overall time spent by CDC staff and contractors is lessened because this package provides tasks and questions to be used in the surveys, thus reducing the time staff would normally spend developing these questions.
| Staff or Contractor | Average Hours per Survey | Average Hourly Rate | Average Cost |
| Contractor instrument preparation, administration, and analysis (GS-12/GS-13 equivalent) | 20/survey | $36.00 | $720/survey |
| FTE survey preparation, administration, and analysis (GS-13) | 20/survey | $39.00 | $780/survey |
| FTE manager survey review (GS-14) | 5/survey | $45.00 | $225/survey |
| Average cost per survey | | | $1,725 |
| Average 1-year cost (based on 200 surveys per year) | | | $345,000 |
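For reference, the per-survey and annual cost figures above follow directly from the hours and rates in the table:

```latex
\begin{align*}
\text{Cost per survey} &= (20 \times \$36.00) + (20 \times \$39.00) + (5 \times \$45.00)\\
                       &= \$720 + \$780 + \$225 = \$1{,}725\\
\text{Annual cost} &= 200 \text{ surveys} \times \$1{,}725/\text{survey} = \$345{,}000
\end{align*}
```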
A-15. Explanation for Program Changes or Adjustments
This revision expands the scope of the clearance to include social media, mobile-based and other electronic communication channels hosting CDC content and adds new questions to the question bank (the new questions are listed in Appendix 11). This ongoing data collection is essential to ensuring that users of CDC Web sites, social media, mobile-based or other electronic communication channels hosting CDC content are able to access information.
A-16. Plans for Tabulation and Publication and Project Time Schedule
| Activity | Time Schedule |
| 1. Determine which Web site or other electronic communication channel will be surveyed. 2. Determine survey method and survey questions. 3. Determine target participation, quotas. 4. Determine method of recruitment. | Within 14 days after approval of this package. |
| 5. Recruit participants for survey (see B-1). 6. Invitation link may be posted on Web site or other electronic communication channel and active respondent recruitment begins. | Within 28 days after approval of this package. |
| 7. Completion of surveys. | Up to 60 days after OMB approval. |
| 8. Analysis of surveys. | Up to 21 days after survey completion. |
| 9. Adjustment of Web site or other electronic communication channel based on results of the survey. | Up to 60 days after survey analysis. |
A-17. Reason(s) Display of OMB Expiration Date is Inappropriate
Exemption is not being sought. The OMB expiration date will be displayed.
A-18. Exceptions to Certification for Paperwork Reduction Act Submission
There are no exceptions to certification.
CDC Project Manager
Carol Crawford, Deputy Director
Division of eHealth Marketing (DeHM)
National Center for Health Marketing (NCHM)
Centers for Disease Control and Prevention (CDC)
MS E-90, 1600 Clifton Road, Atlanta, Georgia 30333
404-498-2480
List of Attachments
1. Executive Order 12862
2. Public Health Service Act (42 USC 241) Section 301
3. 60-day Federal Register Notice
4. Consent Forms
5. Demographic Questions
6. Introductory Questions
7. Core Questions
8. Activity/Task Questions
9. Follow-up Questions
10. Example Survey and Screen Shots
11. New Questions