April 11, 2014
Supplemental Questions for DOC/NOAA Customer Survey Clearance
(OMB Control Number 0648-0342, expiration date: 4/20/2015)
National Ocean Service/Coastal Services Center – Online Learning Product Evaluation
1. Explain who will be conducting this survey. What program office will be conducting the survey? What services does this program provide? Who are the customers? How are these services provided to the customer?
This survey will be conducted by staff at the NOAA/NOS Coastal Services Center (Center) to assess customer views regarding http://csc.noaa.gov/digitalcoast/needs-assessment-guide. The Center serves the needs of coastal and marine natural resource management programs and professionals (e.g., state natural resource management agencies and staff, conservation organization staff) through development and delivery of data and information products, decision-support tools, professional development training, and technical assistance on a variety of topics. Data and information products and decision-support tools are delivered per customer request via online systems (e.g., clearinghouse, direct download). Technical assistance is conducted via telephone, electronically, or in person (on-site), depending on the needs of the specific customer and the specific technical assistance topic.
2. Explain how this survey was developed. With whom did you consult during the development of this survey on content? Statistics? What suggestions did you get about improving the survey?
The Center’s senior instructional designer recommended this approach based on experience and a review of industry standards for online feedback. The format and the two questions were vetted with the Center’s Instructional Design Group, the Learning Services Division program managers, the Center’s Web Design Services department, and the Center’s survey design expert. Suggestions included minor wording modifications, use of a familiar voice in the Likert scale, a four-choice forced-response format to make the data more actionable, and creating an instrument that can be used for any online learning product the Center produces.
3. Explain how the survey will be conducted. How will the customers be sampled (if fewer than all customers will be surveyed)? What percentage of customers asked to take the survey will respond? What actions are planned to increase the response rate? (Web-based surveys are not an acceptable method of sampling a broad population. Web-based surveys must be limited to services provided by Web.)
This survey will be formatted and embedded in the online Needs Assessment Guide and administered using a pop-up window. The survey window will pop up after the user has completed a specified number of clicks in the site or has spent a specified amount of time on the site (both thresholds TBD). Users of the website will thus be sampled based on their level of interaction or engagement with the site, as determined by these two parameters. The minimum expected response rate is 10%, based on past information collections conducted by the Coastal Services Center related to online training and other online learning products. The number of questions will be limited to two in an effort to achieve a higher response rate relative to lengthier surveys. The user will have the option not to respond by clicking a “no thanks” button.
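The engagement-based trigger described above can be sketched as a simple client-side check; this is illustrative only, and the threshold values and function names are placeholders, since the actual click and time thresholds are noted as TBD.

```typescript
// Sketch of the engagement-based survey trigger. The threshold values
// below are hypothetical stand-ins for the to-be-determined parameters.
const CLICK_THRESHOLD = 5;           // hypothetical: clicks within the site
const TIME_THRESHOLD_SECONDS = 120;  // hypothetical: seconds spent on the site

function shouldOfferSurvey(
  clicks: number,
  secondsOnSite: number,
  alreadyDeclined: boolean  // user previously clicked "no thanks"
): boolean {
  if (alreadyDeclined) {
    return false;  // honor the opt-out; do not re-prompt
  }
  // Offer the survey once either engagement threshold is met.
  return clicks >= CLICK_THRESHOLD || secondsOnSite >= TIME_THRESHOLD_SECONDS;
}
```

In a deployed page, this check would run in the pop-up logic of the Needs Assessment Guide itself; the sketch isolates the sampling decision so the two parameters can be tuned independently.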
4. Describe how the results of this survey will be analyzed and used. If the customer population is sampled, what statistical techniques will be used to generalize the results to the entire customer population? Is this survey intended to measure a GPRA performance measure? (If so, please include an excerpt from the appropriate document.)
Survey data will be analyzed using basic descriptive statistics (e.g., percentages, mean scores) only. This information collection seeks to gather general customer feedback to inform improvements to the Needs Assessment Guide (http://csc.noaa.gov/digitalcoast/needs-assessment-guide) and the design of future online learning products. This survey is not intended to measure a GPRA performance measure.
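The descriptive statistics named above reduce to two computations per question: the percentage of respondents choosing each option and the mean score. A minimal sketch, assuming ratings on the four-point forced-choice scale and using hypothetical response data:

```typescript
// Illustrative computation of the basic descriptive statistics described
// above for one survey question on a four-point forced-choice scale.
type Responses = number[];  // each entry is a rating from 1 to 4

// Mean score across all responses to a question.
function mean(responses: Responses): number {
  return responses.reduce((sum, r) => sum + r, 0) / responses.length;
}

// Percentage of responses falling on each of the four scale points.
function percentByOption(responses: Responses): Record<number, number> {
  const pct: Record<number, number> = { 1: 0, 2: 0, 3: 0, 4: 0 };
  for (const r of responses) {
    pct[r] += 1;
  }
  for (const option of [1, 2, 3, 4]) {
    pct[option] = (pct[option] / responses.length) * 100;
  }
  return pct;
}
```

With only two questions and percent/mean summaries, results can be reported directly without inferential statistics, which matches the stated analysis plan.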
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.
Online analytic data for the recently retired needs assessment online training (the most similar online learning product with data) indicate that, on average, there were 1,674 visits/month, or 20,088 visits/year (data from 2013/2014). The potential respondents to the proposed information collection are the targeted populations/customers of the Coastal Services Center (local, state, and national coastal and marine natural resource management programs and professionals, including natural resource management agency staff, conservation organization staff, etc.). Other respondents include anyone in the universe searching the web for information about the process of needs assessment. The respondent selection method is described in Question 2 below. The collection has not been conducted before. The expected response rate for the collection as a whole is 10%, based on similar efforts conducted in the past.
| Sector | Total universe based on web statistics | Estimated responses based on a 10% response rate |
| Federal Government | 4,017 | 402 (20% of total responses) |
| State Government | 8,035 | 804 (40%) |
| Local/Municipal Government | 4,017 | 402 (20%) |
| Non-Profit/NGO | 2,008 | 201 (10%) |
| Private Sector | 2,008 | 201 (10%) |
| Total | 20,085 | 2,010 |
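The table above applies the 10% expected response rate uniformly to each sector's estimated universe; a small sketch confirms that the per-sector estimates and the total of 2,010 responses follow arithmetically (sector names and figures are taken from the table):

```typescript
// Check of the table arithmetic: a 10% response rate applied to each
// sector's estimated universe, rounded to the nearest whole response.
const universe: Record<string, number> = {
  "Federal Government": 4017,
  "State Government": 8035,
  "Local/Municipal Government": 4017,
  "Non-Profit/NGO": 2008,
  "Private Sector": 2008,
};
const RESPONSE_RATE = 0.10;

// Expected responses per sector at the assumed response rate.
const expected: Record<string, number> = {};
for (const [sector, size] of Object.entries(universe)) {
  expected[sector] = Math.round(size * RESPONSE_RATE);
}

// Total expected responses across all sectors.
let totalResponses = 0;
for (const n of Object.values(expected)) {
  totalResponses += n;
}
```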
2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The survey will appear only for those users who have located and selected this online learning product and who have spent enough time perusing it to have formed an opinion of its utility. The survey will be embedded in the online Needs Assessment Guide and administered using a pop-up window. Users of the website will be sampled based on a predefined degree of interaction or engagement with the site; i.e., the survey window will pop up after the user has completed a specified number of clicks in the site or has spent a specified amount of time on the site (both thresholds TBD). The user, including a repeat user, will have the option to opt out by clicking a “no thanks” button. Through observation of web statistics from related sites, we are aware that use often varies greatly from month to month (some months very high, others very low). Because of this unpredictable variation, we propose evaluating the online needs assessment tool over a twelve-month period.
3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.
The intent of this information collection is to assess user feedback on the usability and content of the Needs Assessment Guide in order to inform improvements to site functionality and content, as well as to inform the design of similar online learning products. To improve response rates, the survey is as brief as possible (i.e., two questions), offering quick response options while still providing actionable data. The estimated time necessary for each respondent to complete the two questions is 2 minutes, based on trials with a small (fewer than ten) pilot sample. A “thank you” box will pop up and then fade so that the user knows the response has been successfully sent. Nonresponse (selecting “no thanks”) will be tracked (if possible) and addressed through modifications that may improve the response rate.
4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.
Draft versions of this survey were circulated for review and comment to instructional designers, web developers, and program managers. Reviewers were asked to offer feedback on the length, appropriateness, and clarity of the questions, the content, the online methodology, and other aspects of the instrument. Comments from reviewers were helpful and resulted in design and content changes that clarified questions, simplified instructions, and refined response options. The final survey will be formatted by our web developers after OMB approval. We will conduct an internal beta assessment of the pop-up questions’ functioning within the Needs Assessment Guide with 4-6 intended users. Any necessary modifications will be made prior to release.
5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
The implementation of the information collection and data analysis will be completed by Dr. Chris Ellis at the NOAA Coastal Services Center, available by telephone at (843) 740-1195 or by email at Chris.Ellis@noaa.gov.
Author | Paige Gill |
File Created | 2021-01-31 |