



Survey of State and Local Government Emergency Officials

Interim Report

September 28, 2006






TABLE OF REVISIONS

Date of Revision        Explanation
TBD                     TBD



REPORT TO CONGRESS PREFACE LETTER

SURVEY OF STATE AND LOCAL GOVERNMENT EMERGENCY OFFICIALS REQUIRED BY THE DEPARTMENT OF HOMELAND SECURITY (DHS) APPROPRIATIONS ACT, 2006


State, local, and tribal governments throughout the country will be among the first responders to any terrorist attack, major disaster, or large-scale emergency. Successful prevention of, protection against, response to, and recovery from these potential events require a harmonization of planning and purpose across all levels of government. Although DHS is charged with the overall coordination of this nation’s security, no national preparedness system can function if State, local, and tribal governments do not collaborate with the Federal level on shared priorities, objectives, and policies.


I am pleased to submit to Congress the Interim Report on the Survey of State and Local Government Emergency Officials as directed by the Department of Homeland Security Appropriations Act, 2006 (H.R. 2360). This report on the specific design of a survey of State and local officials meets the congressional requirements set forth in Section 522 of the Senate version of the Act, as directed by the Conference Committee on H.R. 2360. In designing the survey, DHS has reviewed its programs that interact with State, local, and tribal emergency officials; identified which of these officials have regular and frequent contact with DHS; and determined in what capacity their interaction takes place.


DHS will disseminate the survey in late September 2006. The survey will be distributed to State and local homeland security officials in States and territories to gather information about the user-friendliness and effectiveness of the DHS programs with which they interact. The results of this survey will provide actionable insight into how these crucial State and local emergency officials interact with DHS and pave the way for program improvements. The findings of the survey and specific recommendations will be presented in the Final Report on the Survey of State and Local Government Emergency Officials provided to Congress by December 28, 2006.





Michael Chertoff

Secretary

Department of Homeland Security





TABLE OF CONTENTS

REPORT TO CONGRESS PREFACE LETTER

TABLE OF CONTENTS

REPORT OVERVIEW

SECTION 1: INTRODUCTION

Legislative Background of the Survey Project

Purpose and Scope of the Interim Report to be Submitted by September 28, 2006

SECTION 2: TIMELINE AND PROGRESS

Figure 2.1 Project Timeline

Description of Completed Tasks

Description of Tasks to be Completed for Phase II

SECTION 3: DESIGN AND METHODOLOGY

The Survey Instrument

Sampling Design and Response Rate Sufficiency

Survey Dissemination and Collection

Data Analysis Strategy

SECTION 4: LITERATURE REVIEW – HISTORY AND CONTEXT

Satisfaction Surveys: Homeland Security Examples

Figure 4.1 Homeland Security Survey Introductory Summaries

U.S. Conference of Mayors Homeland Security Monitoring Center (2006)

Western Carolina University Institute for the Economy and the Future National Survey of State Homeland Security Officials (2006)

Survey Conducted by U.S. Senator Ken Salazar (2005)

Figure 4.2 Homeland Security Survey Best Practice Designs and Methodologies

Satisfaction Surveys: Other Government Examples

Figure 4.3 Other Government Survey Introductory Summaries

Consumer Product Safety Commission (CPSC): Website Satisfaction Survey (2005)

Department of Defense: Defense Technical Information Center (DTIC) (2004)

Department of Agriculture (USDA): Conservation Technical Assistance (CTA) (2001)

Figure 4.4 Other Government Survey Best Practice Design and Methodologies

United States Government Accountability Office (GAO) Reports

Figure 4.5 Applicable GAO Reports

Homeland Security: Effective Regional Coordination Can Enhance Emergency Preparedness (GAO-04-1009)

Homeland Security: Agency Plans, Implementation, and Challenges Regarding the National Strategy for Homeland Security (GAO-05-33)

Homeland Security: Management of First Responder Grant Programs Has Improved, but Challenges Remain (GAO-05-121)

SECTION 5: DHS EXISTING DATA ANALYSIS

Figure 5.1 Examples of DHS Existing Data that Address Effective and User-Friendly Interaction with State and Local Officials

APPENDIX A: HOMELAND SECURITY SURVEYS

U.S. Conference of Mayors Homeland Security Monitoring Center (2006)

National Association of State Chief Information Officers (NASCIO): Information Security Committee (2006)

Western Carolina University Institute for the Economy and the Future National Survey of State Homeland Security Officials (2006)

National Governors Association (NGA) Center for Best Practices “2006 State Homeland Security Directors Survey: New Challenges, Changing Relationships” (2006)

Survey Conducted by U.S. Senator Ken Salazar (2005)

U.S. Conference of Mayors Homeland Security Monitoring Center (2004)

National Governors Association (NGA) Center for Best Practices “Homeland Security in the States: Much Progress, More Work” (2004)

David B. Cohen, Ph.D., et al. “Effective Preparation or Politics as Usual?”

National Association of Counties (NACo) Homeland Security Funding: The Urban Area Security Initiative (2004)

Figure A.1 UASI Funding

National Emergency Management Association (NEMA): State Spending of Homeland Security Funds (2003)

Figure A.2 Status of FY00-FY02 Homeland Security Funds

APPENDIX B: SENATOR SALAZAR SURVEY

APPENDIX C: OTHER GOVERNMENT SURVEYS

U.K. Office of the Deputy Prime Minister – Planning Inspectorate (PINS) “Customer Satisfaction Survey 2005: Final Report”

Consumer Product Safety Commission (CPSC): Website Satisfaction Survey (2005)

Department of Commerce: Bureau of Economic Analysis (BEA) (2005)

State of Delaware Office of Management and Budget (OMB): Government Support Services (2005)

Department of Defense: Defense Technical Information Center (DTIC) (2004)

Department of Agriculture (USDA): Conservation Technical Assistance (CTA) (2001)

Social Security Administration (SSA) Customer Satisfaction Survey (2001)

APPENDIX D: GAO REPORTS

Nuclear Power: Plants Have Upgraded Security, but the Nuclear Regulatory Commission Needs to Improve Its Process for Revising the Design Basis Threat (GAO-06-555T)

Border Security: Continued Weaknesses in Screening Entrants into the United States (GAO-06-976T)

Homeland Security: Overview of Department of Homeland Security Management Challenges (GAO-05-573T)

Transportation Security: Systematic Planning Needed to Optimize Resources (GAO-05-357T)

Homeland Security: Management of First Responder Grant Programs Has Improved, but Challenges Remain (GAO-05-121)

Homeland Security: Managing First Responder Grants to Enhance Emergency Preparedness in the National Capital Region (GAO-05-889T)

Homeland Security: Actions Needed to Better Protect National Icons and Federal Office Buildings from Terrorism (GAO-05-790)

Homeland Security: Management of First Responder Grant Programs and Efforts to Improve Accountability Continue to Evolve (GAO-05-530T)

Homeland Security: Key Cargo Security Programs Can Be Improved (GAO-05-466T)

Homeland Security: Agency Plans, Implementation, and Challenges Regarding the National Strategy for Homeland Security (GAO-05-33)

Homeland Security: Effective Regional Coordination Can Enhance Emergency Preparedness (GAO-04-1009)

Homeland Security: Transformation Strategy Needed to Address Challenges Facing the Federal Protective Service (GAO-04-537)

Homeland Security: Risk Communication Principles May Assist in Refinement of the Homeland Security Advisory System (HSAS) (GAO-04-538T)

Homeland Security: Effective Intergovernmental Coordination Is Key to Success (GAO-02-1011T)

APPENDIX E: DHS EXISTING DATA

Training Division Evaluator Database

Homeland Security Grant Program After-Action Conference

State and Urban Area Feedback from the Nationwide Plan Review (NPR)

Mobile Implementation Training Teams (MITT)

Recommendations from Homeland Security Grant Program (HSGP) National Asset Data Base (NADB)

Rural Crime and Justice Center: Nationwide Rural Area Law Enforcement Study – Comprehensive Training Evaluation Report

Figure E.4 Program Subject Areas in the Federal Law Enforcement Training

Figure E.5 Participant Distribution in Nationwide Rural Area Law Enforcement Training

Recommendations from State Homeland Security Assessment and Strategy (SHSAS) Program, Technical Assistance 2004 Conference After-Action Reports (AARs)

Recommendations from the State Homeland Security Assessment and Strategy (SHSAS) Program, Data Collection Tool (DCT)

National Emergency Management Baseline Capability Assessment Program (NEMB-CAP) Progress Report

Transportation Security Administration (TSA)/Office of Intelligence Customer Satisfaction Survey

Examples of Applicable OMB Measures

APPENDIX F: ACRONYM LIST



REPORT OVERVIEW

The Department of Homeland Security (DHS) strives to interact with State and local government emergency officials in an effective and user-friendly manner. In order to capture the progress and deficiencies in this effort, Congress tasked DHS with the endeavor of formally surveying State and local stakeholders. This effort will illustrate the results DHS has achieved and will identify problematic areas and respective programmatic improvements.


The Final Report on the Survey of State and Local Government Emergency Officials will be delivered by December 28 of this year. To date, DHS has made significant progress in ensuring that the final deliverable will be “thorough and compelling,” per Senator Gregg’s request. DHS recognizes the significance of this effort and will ensure that the initiative proves valuable to the American people. This report demonstrates the Department’s progress thus far in designing the survey methodology and scope.


The first section, Introduction, describes the purpose and the scope of this effort to survey State and local government emergency officials. It also discusses the legislative underpinnings that served as the impetus for this project. The second section, Timeline and Progress, details work completed towards the Survey of State and Local Government Emergency Officials and next steps to be taken to issue a Final Report by December 28, 2006.


The actual survey this project will utilize to query State and local government emergency officials is presented in the third section, Design and Methodology. The survey is designed to allow for sound, statistically meaningful analysis of responses from a representative sample of homeland security stakeholders at the State and local levels. Also detailed in this section of the report are the individual steps planned for survey distribution and delivery, constructed to encourage a high response rate.


Section 4, Literature Review – History and Context, presents a summary of previous surveys that are relevant to this study. The identification and review of these surveys served two useful purposes: first, they helped to identify best practice methodologies in surveying respondents regarding governmental functions; second, the results of these surveys will provide a contextual landscape for the final results of the survey. DHS researchers also analyzed every Government Accountability Office (GAO) report about DHS to identify State and local issue areas that should be incorporated into the survey design and questions. Appendices A, B, C, and D offer detailed discussions of the individual surveys and GAO reports.


The final section, Existing DHS Data Analysis, summarizes several examples of the type of information that various DHS entities have already collected regarding their interaction with State and local officials. DHS officials interact with stakeholders across the country on a daily basis. Therefore, DHS officials have already begun initiatives to assess and improve their interaction with officials at the State and local level. The pre-existing DHS data helped to inform the design of the survey for this project. Specific information may be found in Appendix E regarding each of the examples of DHS existing data.

SECTION 1: INTRODUCTION

Since the passage of the Homeland Security Act of 2002, the Department of Homeland Security (DHS) has been the Federal government’s primary agency responsible for planning and managing the national response to major disasters, terrorist attacks, and other emergencies. Each day, DHS employees work hard to protect the citizens of the United States. Whether screening airline passengers, protecting the border, or safeguarding critical infrastructure, DHS employees form an important part of the Federal government’s first line of defense against domestic terrorism, natural disasters, and large-scale emergencies.


Since its inception, DHS has understood that domestic preparedness is not only a Federal responsibility. State, local, and tribal governments are on the front lines of domestic security every day as they enforce the laws, save lives, and protect citizens. DHS currently includes over 250 entities that interact with State and local officials. Therefore, DHS sends hundreds of its employees into the field daily to work hand-in-hand with representatives of State and local jurisdictions. Whether working towards mutual objectives, training for special capabilities, or funding essential projects, coordination between the Federal government and State and local officials has become an essential part of our nation’s homeland security.


This survey was developed to gain insight into which DHS programs are effective and user-friendly in their interactions with State and local officials. The survey will also solicit and develop recommendations for possible programmatic improvements. The focus of the survey will be on programs relating to the areas defined by the congressional language in the Department of Homeland Security Appropriations Act, 2006 (H.R. 2360): grant management, intelligence sharing, training, incident management, regional coordination, critical infrastructure prioritization, and long-term homeland security planning. Responsibility for the survey was assigned to the DHS Preparedness Directorate’s new National Preparedness Task Force (NPTF).


Established on May 30, 2006, the NPTF brings together key strategic planning, exercise, and evaluation assets previously dispersed across the Preparedness Directorate. Since its consolidation, the NPTF has provided strategic planning, project management, and overall preparedness integration for the Preparedness Directorate and in support of all DHS operating components. In particular, the NPTF focuses on developing preparedness doctrine and policy, contingency planning, exercise and evaluation, and preparedness field coordination.


This Introduction section details the legislative background of this survey project and the purpose and scope of this Interim Report.


Legislative Background of the Survey Project

Section 522 of H.R. 2360 as passed by the U.S. Senate reads, in part:


…the Secretary of Homeland Security shall conduct a survey of State and local government emergency officials that involves enough respondents to get an adequate, representational response from police, fire, medical, and emergency planners on the regional, State, county, and municipal levels…and identifies problems relating to the effectiveness and user-friendliness of programs in which the Department of Homeland Security interacts with State and local officials.


H.R. 2360 lists grant management, intelligence sharing, training, incident management, regional coordination, critical infrastructure prioritization, and long-term homeland security planning as the kinds of programs to be included in the survey. The Conference Report (H.Rept. 109-241) to H.R. 2360 directs the Secretary of Homeland Security to comply with this section of the Senate bill. In response to these requirements, DHS has designed a nationwide survey of State, local, territorial, and urban area emergency officials that will fulfill the representational and subject-matter requirements of Congress.


Purpose and Scope of the Interim Report to be Submitted by September 28, 2006

This report is required under Section 522(b) of H.R. 2360 as passed by the U.S. Senate, which requires an Interim Report from DHS on “the specific design of the survey.” This report contains:


  • A chronology of survey development, including an overall project timeline

  • An explanation of the design methodology of the survey and the analytical processes that will be used to study the results

  • A summary of the background research conducted to date

  • An analysis of pre-existing DHS data pertinent to the project


The majority of the work completed to date has focused on background research, preliminary data collection, and survey design planning. For several weeks, DHS reviewed all DHS programs that interact with State and local emergency officials. Additionally, DHS compiled and analyzed previous surveys of State and local officials. The results of this background research are presented in Section 4 of this report.


On August 11, 2006, based on the list of programs with State and local interaction compiled by the researchers, DHS issued a request to its offices and agencies to submit any data, research, or analysis related to the effectiveness or user-friendliness of its programs. The data returned by the DHS offices formed a crucial part of the survey design process. In designing the survey, the NPTF is involving staff and subject-matter experts from a variety of backgrounds, including the Johns Hopkins University Center for the Study of High Consequence Event Preparedness and Response, one of the Homeland Security Centers of Excellence (COE).


SECTION 2: TIMELINE AND PROGRESS

Figure 2.1 Project Timeline


Description of Completed Tasks

Department of Homeland Security (DHS) researchers divided the project into individual tasks, each with its own timeframe for completion. Though many tasks overlap, each distinct task is summarized below.


Identify DHS Programs with State and Local Exposure

DHS conducted an extensive review of all DHS programs that had relevant exposure to State or local government officials. The resulting list of programs served as the foundation for the development of the survey.


Filter Program List

In order to more accurately tailor the survey design and analyze future response data, the DHS programs with pertinent State and local interaction were categorized along several lines. These categories were assigned in preparation for survey distribution and in anticipation of eventual respondent data analysis.


  • Type of Interaction: DHS classified programs by types of interaction with State and local emergency officials: direct grants, indirect grants, technical assistance, training, planning, rule-making, and information sharing. A general “Other” category was also included. Many programs had more than one type of interaction. These classifications will be useful in seeking trends in the response data.

  • State and Local Officials with DHS Interaction: DHS classified each program by the State and local emergency officials with whom DHS interacted. The list included: State Administrative Agencies (SAAs), State Homeland Security Advisors, Urban Area Working Groups (UAWGs), and local or tribal jurisdictions. A general “Other” category was also included. Many programs interacted with more than one kind of State or local official.

  • Type of Program: DHS categorized programs by functional area: grant management, intelligence sharing, training, incident management, regional coordination, critical infrastructure prioritization, or long-term homeland security planning.

  • Previous Evaluation and Budget: Finally, researchers identified previous program evaluations, such as those conducted through the Program Assessment Rating Tool (PART). Each program’s proposed FY07 budget was included in order to establish an appreciation for the size, scope, and resources available to these programs.


Collect Preliminary Data

On August 11, 2006, a request for information was sent to all DHS offices and agencies that had been determined to have applicable State or local interaction. DHS made the request in order to capture known issues to incorporate into survey questions and to identify best practice methodologies. Furthermore, this pre-existing data will help to contextualize survey responses. Hence, the request sought any research, data, or analysis that could be used for the survey project. Requests focused on data that would demonstrate the opinions of State and local officials on the effectiveness and user-friendliness of DHS programs. DHS already houses substantial amounts of pre-existing data on a programmatic level to monitor the effectiveness and user-friendliness of its interaction with State and local government emergency officials. This report will detail some specific examples of the type of data DHS currently utilizes.


DHS researchers also reviewed two categories of third-party surveys. The first set consisted of independent surveys of State and local homeland security officials. Groups such as the U.S. Conference of Mayors, the National Governors Association, the National Association of State Chief Information Officers, and a small number of university researchers conducted these surveys. These surveys not only provided valuable insight into past and present opinions of homeland security programs, but also assisted in designing the survey to be distributed during this project’s Phase II (the effort to be conducted after submission of this report). Once the responses to the survey have been collected, these background studies will provide a rich context for the results.


The second category of surveys was unrelated to DHS but was analyzed to provide further insight into survey design and methodology. These surveys all measured customer satisfaction with government entities. Examples include a Department of Agriculture survey of its conservation technical assistance programs and a Social Security Administration survey on customer satisfaction with its publications.


Summarized in Section 3, Design and Methodology, this extensive research provided an invaluable independent perspective on DHS programs with State and local interaction, and also helped to form the content and structure of the survey.


Analyze Preliminary Data

DHS component agencies and offices began returning data almost immediately after they received the request for pre-existing information related to DHS programmatic effectiveness and user-friendliness. Early submissions included 10,000 course evaluations from DHS-sponsored training courses for State and local emergency officials. Other submissions included the results of the 2003 and 2004 State Homeland Security Assessment and Strategy (SHSAS) surveys, detailing the emergency capabilities of jurisdictions around the country as well as recommendations for how to improve the homeland security system.


DHS reviewed the data to identify information applicable to the survey project. For instance, DHS identified a sample set of potential respondents through this process, which provided important information that contributed to the survey design. Additionally, researchers found trends that could impact eventual survey responses. Such trends in the data indicate a widely divergent set of opinions regarding the effectiveness and user-friendliness of DHS programs at the State, local, and tribal level. For example, respondents at the State and local level sometimes express frustration with aspects of their DHS interactions, such as limited funding and difficulties with intelligence sharing. DHS incorporated the identified issues into the survey questions.


The data analysis also indicated that State and local officials themselves sometimes disagree regarding the priority DHS should afford different policies. For instance, respondents from urban areas frequently opined that rural communities have a disproportionate impact on regional planning. Rural areas, meanwhile, say that too much attention goes to urban areas, ignoring citizens and critical infrastructure in their jurisdictions. These conflicting opinions demonstrate the difficulty in developing nationwide plans and priorities that address the concerns of all jurisdictions.


Design Survey and Produce Phase I Report

Based on the analysis of the DHS data and the third-party surveys, DHS designed a survey that would fulfill Congress’s participation and subject-matter requirements and account for the issues raised in the analysis of the preliminary data.


The final survey, in its entirety, is located in Section 3 of this report. The 28-question survey is divided into two general sections—one each about the effectiveness and user-friendliness of DHS programs with State and local interaction. The section on effectiveness contains 13 questions; the section on user-friendliness contains 12; and there are three general questions. Respondents are provided with a seven-point Likert Scale on which they can rank their opinions of the effectiveness and user-friendliness of DHS programs. There are also two open-ended sections where respondents can write suggestions to improve both the effectiveness and user-friendliness of the DHS programs with which they interact.


Description of Tasks to be Completed for Phase II

Whereas Phase I focused on designing the survey, Phase II will focus on the distribution and collection of the survey itself, the analysis of response data, and the compilation of recommended programmatic changes to address the issues raised by respondents. The list of tasks remaining to be completed is as follows.


Conduct Survey

In late September 2006, DHS will begin distributing the survey. In order to determine the contact information for possible respondents, the survey team used a combination of SAA contact information and the Lessons Learned Information Sharing (LLIS) member network. Lessons Learned Information Sharing is a secure, national, online network of Lessons Learned and Best Practices designed to help emergency response providers and homeland security officials prevent, prepare for, respond to, and recover from acts of terrorism. With tens of thousands of emergency response officials already registered, use of the LLIS network not only provides a broad reach for the survey but also encourages further use of a proven homeland security program designed for State and local government emergency officials.


Applicants for the LLIS network are screened and approved to ensure the network is comprised of homeland security authorities and emergency response personnel. Using the thousands of registration files in the LLIS system, DHS researchers compiled a list of State and local homeland security officials to whom the survey will be sent. In addition to allowing quick and easy access to thousands of respondents, members of the LLIS network already have a strong preexisting relationship with DHS. This not only makes them more likely to respond to the survey, but also more likely to provide useful and accurate insight into the programs with which they interact.


The LLIS network is representative of the national population of homeland security stakeholders and extends to all types of officials at all levels of government. For example, included in the LLIS membership are firefighters from rural counties, State-level emergency planners, small-town police officers, and EMS officials from the largest cities in the United States. Furthermore, each group is represented in numbers high enough that an appropriately diverse sample can easily be obtained.


In addition to the LLIS network, the survey team will send the survey to the SAAs. Responsible for their overall statewide homeland security systems, SAAs are an essential part of the homeland security network. Given their small number and their importance to the Federal homeland security structure, the survey team decided to sample all SAAs, regardless of whether they are LLIS members.


Prior to survey distribution, a letter of introduction will be sent to the targeted sample pool via email. This letter introduces the project, requests participation, and allows the survey administrators a preliminary opportunity to verify contact information. This letter can be found in the Design and Methodology section of this report. The survey distribution mechanism utilizes an online instrument which will be presented to survey respondents as a website at www.dhssurvey.org. On July 20, 2006, DHS requested an emergency reinstatement for the State Domestic Preparedness Data Collection Tool (DCT) from the Office of Management and Budget (OMB). This request should satisfy the Paperwork Reduction Act requirements. Respondents access survey questions in a standardized format, eliminating variation in the administration of the survey that could skew results. DHS will make subsequent phone calls to maintain the appropriate proportionality of respondents.


Compile and Analyze Response Data

As survey responses are returned to DHS, researchers will compile and analyze the data. The web-based application will standardize the format of the responses, so that data can be imported into the analysis platform to ensure a cogent data set. Conclusions will be drawn along homeland security functions, demographic delineation, and jurisdictional levels. Effectiveness and user-friendliness will be assessed by subgroup, allowing for identification of problematic areas and potential programmatic recommendations, bridging the gap between survey results and future policy action. These results will be contextualized within the research of background data contained in this report.
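
As an illustration only of how standardized responses might be summarized along those lines, the short sketch below (in Python) computes mean Likert ratings by discipline and jurisdictional level. The column names and values are hypothetical and do not reflect the survey’s actual data schema.

    import pandas as pd

    # Hypothetical standardized export from the web survey application.
    responses = pd.DataFrame({
        "discipline":        ["fire", "fire", "law", "medical"],
        "jurisdiction":      ["county", "municipal", "State", "county"],
        "effectiveness":     [5, 6, 4, 6],   # seven-point Likert ratings
        "user_friendliness": [4, 5, 3, 6],
    })

    # Mean ratings broken out by discipline and jurisdictional level.
    summary = (responses
               .groupby(["discipline", "jurisdiction"])[["effectiveness", "user_friendliness"]]
               .mean()
               .round(2))
    print(summary)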


Produce Phase II Report

The Final Report, to be submitted to Congress no later than December 28, 2006, will provide an overview of all major trends in the responses, highlight State and local opinions of the effectiveness and user-friendliness of the DHS program areas contained in the survey, and provide potential programmatic changes to address issues identified in the survey.

SECTION 3: DESIGN AND METHODOLOGY

The Survey Instrument

This section contains the survey that will be delivered to the sample population. The survey is designed to address the issue areas required by Congress. Namely, the survey identifies issues related to the “effectiveness and user-friendliness of programs in which the Department of Homeland Security (DHS) interacts with State and local officials, including grant management, intelligence sharing, training, incident management, regional coordination, critical infrastructure prioritization, and long term homeland security planning.”


The survey instrument comprises 28 questions that can be answered on a seven-point Likert Scale¹. Additionally, two areas are left open-ended and are intended to solicit qualitative feedback from respondents. Established survey development techniques informed the design and will support sound statistical analysis. DHS plans to test the survey prior to the instrument’s launch by vetting it through eligible members of the survey population who are not selected in the sample.


Divided into three sections, the survey first presents questions that ask respondents to assess the effectiveness of various DHS initiatives. The second section seeks feedback about the user-friendliness of DHS programs and activities. The final section seeks basic information concerning how much, and in what capacity, the respondent interacts with DHS. Two areas in the survey allow respondents to comment about the effectiveness and user-friendliness of DHS programs in their own words. Please find the complete survey on the subsequent pages.


[Survey instrument]

Sampling Design and Response Rate Sufficiency

Congress has specified that the survey have an adequate representational response from State and local homeland security officials across functional areas and jurisdictional levels. To satisfy this requirement, the survey sample design was constructed to account for the relative proportions of the subgroups of interest (such as law enforcement, fire, medical, and emergency planners). The goal is to generate a sample that will allow valid and defensible inferences to be made between and across the relevant subgroups of the sample, so that the results of the survey can be generalized to the population of State and local homeland security officials at large. The sample design will ensure that the survey results can drive comprehensive and effective policy modeling when addressing any problems relating to the effectiveness and user-friendliness of DHS programs that are identified by the survey.


Figure 3.1 LLIS.gov registered user population composition

Lessons Learned Information Sharing (LLIS.gov) users are the population from which the sample will be drawn because they constitute the best available representation of the national community of emergency response providers and State and local homeland security personnel. With tens of thousands of registered users from across the country, LLIS.gov provides a potential respondent pool that is large enough to capture all relevant subgroups of interest. The user population is sufficiently diversified across functional areas and jurisdictional levels, as shown in Figure 3.1.


A stratified random sampling strategy will ensure the sample meets compositional requirements. The proportion of the sample that falls into a given category (e.g., fire or law enforcement; county or municipal) will mirror the percentage of the actual LLIS population that falls into the same category.
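
As a simple illustration of this proportional allocation, the following sketch draws a stratified sample from a list of user records. It is a minimal sketch only; the record fields and stratum key are hypothetical and do not reflect the actual LLIS schema.

    import random
    from collections import defaultdict

    def stratified_sample(population, stratum_key, total_n, seed=2006):
        """Draw a sample whose strata mirror their population shares."""
        rng = random.Random(seed)
        strata = defaultdict(list)
        for user in population:
            strata[stratum_key(user)].append(user)
        sample = []
        for members in strata.values():
            # Allocate draws in proportion to the stratum's population share;
            # rounding can shift the overall total by a seat or two.
            n = round(total_n * len(members) / len(population))
            sample.extend(rng.sample(members, min(n, len(members))))
        return sample

    # Hypothetical usage: stratify jointly by discipline and jurisdiction.
    # sample = stratified_sample(llis_users,
    #                            lambda u: (u["discipline"], u["jurisdiction"]),
    #                            total_n=1500)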


Lessons Learned Information Sharing (LLIS) members can be categorized into jurisdictional levels and functional user subgroups based on self-selected categorizations contained in user profiles. Comparisons can be made only along the same lines that define the sample strata; while these data can be used to compare county-level responses with municipal-level responses, it will not be possible to make comparisons between municipalities of varying size. Given the efforts DHS will be making to increase the response rate (discussed below), a sample size of 1,500 adequately meets statistical requirements for data analysis at approximately a 40% response rate. DHS anticipates exceeding this 40% response rate by a substantial margin. The problem of non-response bias is discussed in the “Data Analysis Strategy” section of this report.
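
As a rough plausibility check on these figures (not the official power calculation), the arithmetic below shows the sampling error implied by 1,500 invitations at a 40% response rate, assuming simple random sampling and the worst-case proportion of 0.5.

    import math

    invited, response_rate = 1500, 0.40
    completes = invited * response_rate              # roughly 600 returned surveys
    moe_95 = 1.96 * math.sqrt(0.25 / completes)      # worst-case 95% margin of error
    print(f"{completes:.0f} completes -> +/-{moe_95:.1%} at 95% confidence")
    # prints: 600 completes -> +/-4.0% at 95% confidence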


DHS will employ several strategies recommended by the Office of Management and Budget (OMB) to assure adequate response rates², including:


  • Personally addressed letters of introduction from a senior agency official

  • Multiple modes of data collection

  • A user-friendly response instrument

  • Follow-up phone calls and emails


While no previous surveys of this type have been conducted of the LLIS population, DHS is confident that its data collection strategies will ensure a high level of response. The primary letter of introduction, included below, is designed to promote awareness and provide critical information about the survey that has been shown to increase participation³ (including time commitments, meaningful motivational language, endorsement by a senior agency official, and a confidentiality pledge). DHS plans to email the survey link to those included in the sample after the letter of introduction has been distributed. DHS will then contact non-respondents via phone and additional emails to provide them with the option of completing the survey over the phone, online, or having a hard copy mailed to them, thus allowing multiple modes of data collection. Potential respondents will have the choice to opt out of the survey exercise. DHS will analyze each follow-up data collection effort to determine if response rates are sufficient. Researchers will repeat this process until maximum possible response rates have been achieved in the timeframe allowed by the study.


Although the survey sample has been designed to adequately represent all relevant subgroups, a comprehensive non-response bias analysis plan is presented in the “Data Analysis Strategy” section of this report.


Survey Dissemination and Collection

The following letter of introduction will be sent to respondents one week prior to survey distribution.

National Preparedness Task Force

Preparedness Directorate

Washington, DC 20531



Dear [Insert Applicable Name]:


In the next week, the Department of Homeland Security (DHS) will begin surveying homeland security officials and first responders to help assess the effectiveness and user-friendliness of DHS programs. The results of the survey will guide ongoing efforts to improve DHS programs. Feedback from stakeholders across the country will be critical to this initiative.


You have been identified as an active member of the homeland security community and DHS needs your input to understand how we can more effectively meet the needs of those with whom we work most closely. Within the next week, you will receive an email invitation to participate in this survey. The survey should take no more than twenty minutes of your time to complete. If you have any questions at any time during this process, please contact Jonathan Schneller at jonathan.schneller@associates.dhs.gov.


Your responses will be aggregated and reported to Congress in December 2006. No names will be used in the report. The results of the survey will be drawn upon to gauge the usability and effectiveness of DHS programs and to identify areas for potential improvements. We appreciate your support and willingness to participate in this important effort.


Sincerely,

George Foresman

Under Secretary for Preparedness

Department of Homeland Security




The actual distribution of the survey will occur one week later as respondents are invited to visit a dedicated website to fill out the survey. The online collection mechanism will facilitate the collection and standardization of the data and provide immediate feedback on the response rates in each of the subgroups of the sample population. Response rates will be monitored throughout the data collection phase of this process. Surveys collected by phone and other follow-up methods will be used to boost response rates when deemed necessary to maintain the proportionality that is required for the sample to be representative of the entire population and each of its subgroups. All data collection activities will maintain the integrity of the dataset and ensure the validity of the study.


Data Analysis Strategy

The validity of the survey results depends on the composition of the respondent pool. DHS plans to analyze the data for non-response bias before any content analysis takes place.  Given the measures DHS is taking to ensure high rates of response, researchers set the goal of an 80% response rate for all strata within the sample.  In the event that the response rate in any of the identified sample strata falls below 80%, DHS will enact a non-response bias analysis strategy.  As part of this strategy, the Survey of State and Local Government Emergency Officials defines a “non-respondent” as any potential respondent who either cannot be contacted or who chooses not to participate when contacted. 


Using demographic information provided for each potential respondent that exists in the Lessons Learned Information Sharing (LLIS) network, researchers will compare known respondent characteristics to those of non-respondents to determine potential sources of bias. First, all of the sample stratification characteristics will be statistically compared to determine whether non-response appears random among these strata. This step of the process will identify any correlations between the likelihood of participation in the survey and the survey variables being measured. If strong correlations are found, DHS will take steps to mitigate the associated non-response bias. Following this initial analysis, a second comparison based on demographic profiles will be used to determine if respondents are statistically different from non-respondents. Should it be determined that the source of non-response is not random, unit weights may be adjusted in the analysis, depending upon the level of non-response rates present in the data. If it is determined that weighting is not appropriate, a decision regarding the publication of certain data points or comparisons will then be made.
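
The sketch below shows one way such a comparison might be implemented: a chi-squared test of the stratum composition of respondents versus non-respondents and, if non-response does not look random, simple post-stratification weights. The counts are illustrative only and do not come from the survey.

    from scipy.stats import chi2_contingency

    strata = ["fire", "law enforcement", "medical", "emergency planning"]
    respondents     = [180, 220,  90, 110]   # illustrative counts only
    non_respondents = [120, 160, 110,  90]

    chi2, p, dof, expected = chi2_contingency([respondents, non_respondents])
    if p < 0.05:
        # Non-response does not appear random across strata, so reweight:
        # each stratum's weight restores its share of the invited sample.
        invited = [r + n for r, n in zip(respondents, non_respondents)]
        total_resp, total_inv = sum(respondents), sum(invited)
        weights = {s: (inv / total_inv) / (r / total_resp)
                   for s, r, inv in zip(strata, respondents, invited)}
        print(weights)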


Once the pool of respondents has been analyzed for non-response bias and other potential flaws in the dataset, DHS will begin analysis of the content questions of the survey. General measurements of effectiveness and user-friendliness will be produced once all the survey responses have been collected. Additional analysis is needed, however, to focus policy choices and to maximize the value of the survey. Data will be analyzed on several additional levels: first, trends in the national community of responders and State and local homeland security officials; second, statistically significant differences in the responses of the different categories included in this study (fire, law enforcement, medical, and emergency planners); and third, differences that emerge among the various jurisdictional subgroups that have been identified (municipal, State, county, regional). The goal of the analysis is to provide a clear picture of the effectiveness and user-friendliness of DHS programs and to identify specific opportunities for improvement.


Trends at the National Level

In recognition of the broad mission and many different types of DHS programs, the survey has been designed to identify specific opportunities for improvement in each relevant program area (grant management, intelligence sharing, etc.). In the event that the variance of the responses is significant, the data can be further analyzed to search for correlations that may explain variations according to region of the country, population density, or other demographic variables. General trends in effectiveness and usability will be identified to help focus recommendations for improvement, thus reinforcing the link between survey results and policy development.


Analysis of Functional User Subgroups

Analysis of the functional user subgroups will seek to identify any significant differences in perception of DHS programs across homeland security disciplines (fire, law enforcement, medical, emergency planning). The analysis strategy will mirror that of the national analysis. Similar inquiries will be performed in the event that there is significant variance across or within groups, and to determine if the responses that were most frequently offered are truly representative of the sentiments of the whole. Other explanatory variables include jurisdiction, time in current position, and DHS workload.


Analysis of Jurisdictional Level Subgroups

Similar analysis will be performed to determine whether there is a correlation between users’ perceptions of effectiveness and usability and the jurisdictional level at which they interact with DHS programs. This analysis will help to determine the level at which policy reforms and initiatives must be executed in order to assure that improvement strategies are applied effectively.


All levels of analysis will use accepted statistical techniques such as analysis of variance, calculation of confidence intervals, and chi-squared tests for statistical significance. Data will be presented in a simple and straightforward manner to ensure the transparency of the analysis.
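
For illustration, the sketch below applies two of the named techniques to hypothetical seven-point Likert responses grouped by discipline: a one-way analysis of variance across groups and a 95% confidence interval for one group’s mean rating. The data are invented for the example.

    from scipy import stats

    groups = {
        "fire":            [5, 6, 4, 5, 7, 5],
        "law enforcement": [4, 4, 5, 3, 4, 5],
        "medical":         [6, 5, 6, 7, 5, 6],
    }

    # One-way analysis of variance: do mean ratings differ by discipline?
    f_stat, p_value = stats.f_oneway(*groups.values())

    # 95% confidence interval for the mean rating of one subgroup.
    fire = groups["fire"]
    mean = sum(fire) / len(fire)
    lo, hi = stats.t.interval(0.95, df=len(fire) - 1, loc=mean,
                              scale=stats.sem(fire))
    print(f"ANOVA p = {p_value:.3f}; fire mean CI = ({lo:.2f}, {hi:.2f})")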


SECTION 4: LITERATURE REVIEW – HISTORY AND CONTEXT

The Department of Homeland Security conducted an extensive review of literature pertaining to the relationship between DHS and State and local government emergency officials. The review yielded several sources for the history and context of the survey: a set of existing surveys of State and local homeland security officials; a set of surveys from other agencies unrelated to DHS, but reviewed for inputs on survey methodology; and several reports from the U.S. Government Accountability Office.


Satisfaction Surveys: Homeland Security Examples

The following section summarizes published surveys conducted to measure effectiveness and user-friendliness of various homeland security efforts within the United States. DHS identified and analyzed these surveys for two reasons: 1) to incorporate effective homeland security survey practices into the Survey of State and Local Government Emergency Officials; and 2) to provide a context for final results in the Phase II Report. DHS extensively analyzed each survey.


This section contains summaries of this analysis; the full DHS analytical reports can be found in Appendix A. The following table, Figure 4.1, introduces the surveys. The best practice designs and methodologies identified for use in the survey are displayed in Figure 4.2.

Figure 4.1 Homeland Security Survey Introductory Summaries

Title: U.S. Conference of Mayors Homeland Security Monitoring Center
Objective: To determine how cities were approaching preparedness issues; assess progress since 9/11/2001; gauge differences between cities of different sizes.
Takeaways: Stratification of sample can show unique trends; make certain to survey all levels of jurisdictions proportionately and adequately.

Title: National Association of State Chief Information Officers Information Security Committee
Objective: To identify condition of State cyber security initiatives; assess nature of their relationship with DHS cyber security programs.
Takeaways: Relate issues to objectives; present strategic and tactical level recommendations.

Title: Western Carolina University National Survey of State Homeland Security Officials
Objective: To examine data on funding programs, opinions of homeland security priorities, and key pieces of infrastructure.
Takeaways: Different interpretations of homeland security responsibilities on the State and local levels.

Title: National Governors Association Center for Best Practices - State Homeland Security Survey
Objective: To gauge the priorities, policies, and governance structure of State homeland security agencies.
Takeaways: Disagreement between State governments over what their role should be in a national homeland security system.

Title: Survey Conducted by U.S. Senator Ken Salazar
Objective: To gauge opinions on grant applications process, distribution of intelligence, and regional coordination efforts by DHS.
Takeaways: Demonstrated success in developing regional partnerships; differences exist in the stakeholder opinions of the subgroups within the State.

Title: U.S. Conference of Mayors Homeland Security Monitoring Center
Objective: To track the use of homeland security funds sent to States and local jurisdictions.
Takeaways: Showing methodology is essential; include quantitative and qualitative analysis.

Title: National Governors Association Center for Best Practices - Homeland Security in the States
Objective: To gauge State homeland security policy, governance, preparedness, coordination, communication, and intelligence sharing.
Takeaways: Interoperable communications still a priority; DHS grants may need to focus more on response.

Title: David B. Cohen, Ph.D., et al. - Effective Preparation or Politics as Usual?
Objective: To determine if a strong Federal homeland security system reduces the incentives of State and local governments to work towards their own preparedness.
Takeaways: Officials commit more resources to preparedness as perceived threat increases; larger cities and “conservative” areas more likely to take preparedness seriously.

Title: National Association of Counties - Homeland Security Funding: The Urban Area Security Initiative (UASI)
Objective: To track the spending of UASI funds distributed in counties across the country.
Takeaways: Open-ended “how” or “why” questions good for soliciting recommendations; phone follow-up can increase response rate.

Title: National Emergency Management Association - State Spending of Homeland Security Funds
Objective: To determine how Federal homeland security funding was administered at the State level.
Takeaways: Use of Likert Scales; governance structure can impact how a jurisdiction perceives its homeland security role.


The surveys reviewed below characterize the types of lessons learned from past efforts. Complete discussions of all surveys included in Figure 4.1 can be found in Appendix A.

U.S. Conference of Mayors Homeland Security Monitoring Center (2006)

The U.S. Conference of Mayors commissioned a study on homeland security and emergency preparedness in order to improve the general homeland security programs of the Federal government and municipalities around the country. The survey, entitled Five Years Post 9/11, One Year Post Katrina: The State of America’s Readiness, sought to evaluate how cities and towns were approaching their preparedness responsibilities, assess strengths and weaknesses of the current system, and gauge whether there were differences in preparedness policies between cities of different sizes.


Once responses had been received, the Homeland Security Monitoring Center analyzed the data both cumulatively and broken down by city population. The U.S. Conference of Mayors cataloged each response according to whether it came from a city with a population below 100,000, between 100,001 and 300,000, or above 300,000 people.




The results of the survey reflected progress in overall preparedness since September 11, 2001, with 92% of respondents indicating a five or better on a 1-10 scale of improvement. However, 80% of respondents indicated that they have not received sufficient Federal resources to achieve full communications interoperability, with an average timetable for achieving interoperability of four years. While cities are, on average, confident in their ability to survive on their own in the 72 hours following a disaster, the tables were turned concerning a pandemic flu outbreak: seventy percent of cities indicated that they would not be prepared to handle a pandemic flu outbreak on their own (the question’s timeframe was “the first days and weeks”).

Western Carolina University Institute for the Economy and the Future National Survey of State Homeland Security Officials (2006)

Western Carolina University’s Institute for the Economy and the Future conducted a survey to gauge the opinions of State Homeland Security directors and officials as to the quality, effectiveness, and efficiency of State and Federal homeland security programs. Specifically, the survey sought information about funding programs and opinions about homeland security priorities, responsibilities, and key pieces of infrastructure. Respondents answered questions about various homeland security policies, priorities, and programs—both at the State level as well as at the Federal level. The survey was sent to the homeland security advisor and the head of the emergency management division in each State. The survey was conducted by mail, telephone, and Internet, and eventually collected responses from 38 States.


The report found that, although 64% of respondents did not believe DHS’s mission had been clearly outlined, 62% believed their own office’s mission at the State level had been adequately defined.



Survey Conducted by U.S. Senator Ken Salazar (2005)

U.S. Senator Ken Salazar of Colorado added the congressional language calling for this project to the Senate version of H.R. 2360. Beginning in 2005, Senator Salazar distributed a survey to State and local homeland security officials in Colorado. The survey was intended to gauge respondents’ opinions on the grant applications process, the timely and accurate distribution of intelligence, and the regional emergency preparedness coordination efforts of DHS.


Among other relevant issues discussed in Appendix A to this report, the results of the Colorado survey suggested that significant disagreements exist at the State and local levels about what DHS’s funding priorities should be. For instance, rural areas surveyed by Senator Salazar indicated that they believed too much grant money was disbursed to large cities, ignoring the citizens and the critical infrastructure in rural areas. Similarly, some urban areas indicated their belief that rural areas have too much influence in setting priorities during regional coordination meetings. Overall, the survey found that “[d]espite intense efforts to improve local security, Colorado’s first responders and emergency officials feel largely unprepared for a major terrorist incident and frustrated by inconsistent direction from the federal government.”⁴




In designing the Survey of State and Local Government Emergency Officials, DHS will ensure that the sentiments of the varied subgroups of stakeholders are accounted for, so that policy initiatives can be targeted to address the concerns of each subgroup. For a complete discussion of all surveys included in Figure 4.1 (above) please refer to Appendix A.


These surveys demonstrated several best practice designs and methodologies to ensure effective survey techniques. The Survey of State and Local Government Emergency Officials incorporates these elements. An illustration of these techniques is provided below in Figure 4.2.

Figure 4.2 Homeland Security Survey Best Practice Designs and Methodologies

[Figure 4.2 is a matrix indicating, for each of the homeland security satisfaction surveys listed in Figure 4.1, whether it addresses effectiveness; addresses user-friendliness; uses a random sample; stratifies the sample; covers different levels of government; covers programmatic functions; uses neutral questions; uses a Likert Scale; and includes recommendations.]

Satisfaction Surveys: Other Government Examples

The following section summarizes several published surveys regarding other governmental entities. These surveys all sought to measure user-friendliness within the public sector. As such, they provide valuable insight into existing methodologies and analytical frameworks for measuring user-friendliness of governmental functions. DHS extensively analyzed each survey. This section contains summaries of this analysis; the full DHS analytical reports can be found in Appendix C. The following table, Figure 4.3, introduces the surveys reviewed. The best practice design and methodologies identified for use in the Survey of State and Local Government Emergency Officials are displayed in Figure 4.4.

Figure 4.3 Other Government Survey Introductory Summaries

Title: U.K. Office of the Deputy Prime Minister - Planning Inspectorate
Objective: To measure the levels of satisfaction with groups interacting with PINS; to receive recommendations on how PINS could improve weaknesses.
Takeaways: Action Indicator tool allows assignment of a numerical value measuring priorities for change; allows respondent comments; opinions not otherwise captured made available for analysis.

Title: Consumer Product Safety Commission
Objective: To rate the utility and user-friendliness of CPSC website.
Takeaways: Four-point scale eliminates the ability of respondent to have neutral satisfaction level.

Title: Department of Commerce - Bureau of Economic Analysis
Objective: To better understand the needs of customers and help BEA’s economic accounts and services become more user-friendly and responsive.
Takeaways: Tracking customer satisfaction surveys over time allows evaluation of improvement and implementation; effective standardization makes departments comparable.

Title: State of Delaware Office of Management and Budget Government Support Services
Objective: To measure vendor performance on central contracts; assess contract effectiveness and efficiency.
Takeaways: Information on specific sampling, distribution and collection methods necessary to analyze results; analysis of survey findings creates accountability and furthers measures for improvement.

Title: Department of Defense - Defense Technical Information Center
Objective: To gauge the level of satisfaction among general users and identify possible areas for improving products and services.
Takeaways: Current contact information is essential for feedback; low response rate alters analysis; targeted questions on how services are used; write-in box for further commentary.

Title: Department of Agriculture Conservation Technical Assistance
Objective: To measure the customer satisfaction of individuals who had used the Technical Assistance component of the Natural Resources Conservation Service (NRCS).
Takeaways: Effective standardized methodology with numeric entry, direct collection, scaled ratings; user demographics included; specific feedback would be helpful for further improvement.

Title: Social Security Administration (SSA) Customer Satisfaction Survey (2001)
Objective: To evaluate the user-friendliness, usefulness and objectivity of SSA analysis, and gather opinions on the focus of SSA’s target issues.
Takeaways: Stratification of respondents; notification and reminder letters for increased response rates; information on program goals.


The surveys reviewed below illustrate the types of lessons learned from past efforts. Complete discussions of all surveys included in Figure 4.3 can be found in Appendix C.

Consumer Product Safety Commission (CPSC): Website Satisfaction Survey (2005)

This survey was designed by the CPSC to rate the utility and user-friendliness of its website. Respondents were asked a series of questions designed to determine their ability to find information and navigate the site, the page-loading time they experienced, their satisfaction with the site’s design and layout, and the usefulness of the information it contained. Furthermore, they were asked to rate their overall satisfaction level on a four-point scale and to offer any comments they felt would improve the quality of the site. They were also asked how they heard about the site and how frequently they use it.


An interesting aspect of this survey was its use of a four-point scale. The CPSC survey allowed respondents to select only “Very Satisfied,” “Satisfied,” “Unsatisfied,” or “Very Unsatisfied.” This eliminates the option of a neutral response, forcing respondents toward either a generally positive or a generally negative assessment of the site.




The Survey of State and Local Government Emergency Officials will take special care to construct the range of responses to avoid skewing the results of the analysis by artificially creating preference structures or allowing respondents to answer questions that are not applicable to their experience.
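To make the scale-design point concrete, the sketch below contrasts a forced-choice four-point scale with a five-point scale that adds a neutral midpoint and a separate “Not Applicable” option. It is an illustration only: the option labels, sample responses, and tallying logic are assumptions for this example, not the DHS instrument’s actual design.

```python
from collections import Counter

# Hypothetical option sets; the DHS instrument's actual wording may differ.
FOUR_POINT = ["Very Satisfied", "Satisfied", "Unsatisfied", "Very Unsatisfied"]
FIVE_POINT = FOUR_POINT[:2] + ["Neutral"] + FOUR_POINT[2:]

def tally(responses, scale):
    """Count responses per option, keeping "N/A" out of the satisfaction
    tally so respondents without relevant experience do not skew results."""
    counts = Counter(r for r in responses if r in scale)
    not_applicable = sum(1 for r in responses if r == "N/A")
    return counts, not_applicable

# Invented responses: two respondents lack relevant program experience.
sample = ["Satisfied", "Neutral", "N/A", "Unsatisfied", "N/A", "Very Satisfied"]
for scale in (FOUR_POINT, FIVE_POINT):
    # Under the four-point scale, the "Neutral" answer simply has nowhere
    # to go, which is the distortion the paragraph above warns against.
    counts, na = tally(sample, scale)
    print(counts, "| set aside as N/A:", na)
```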

Department of Defense: Defense Technical Information Center (DTIC) (2004)

DTIC conducted a survey to gauge the level of satisfaction among general users and identify possible areas for improving its products and services. Web-based and e-mail surveys were the primary collection methods for reaching DTIC’s registered customers, a total sample of 7,901. One-on-one follow-up telephone calls were used to gather contact information in order to increase the response rate, resulting in a total of 1,317 responses (17%). However, the limited calling effort and unresponsive users led to a low response rate, greatly decreasing the validity of the results.




The Survey of State and Local Government Emergency Officials will include strategies to combat problems arising from outdated contact information, and comprehensive follow-up strategies for boosting response rates to ensure the validity of the final analysis.
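As a minimal illustration of the response-rate bookkeeping this implies, the sketch below recomputes the DTIC figures cited above; the 50% target used to trigger follow-up is an invented threshold for the example, not a DHS standard.

```python
def response_rate(completes: int, frame_size: int) -> float:
    """Completed surveys divided by the size of the sampling frame."""
    return completes / frame_size

# DTIC figures from the text: 1,317 responses from a frame of 7,901.
rate = response_rate(1317, 7901)
print(f"Response rate: {rate:.1%}")  # about 16.7%, reported as 17%

TARGET_RATE = 0.50  # hypothetical follow-up trigger for this sketch
if rate < TARGET_RATE:
    print("Below target: send reminder letters and schedule telephone follow-up.")
```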

Department of Agriculture (USDA): Conservation Technical Assistance (CTA) (2001)

The purpose of the USDA survey was to measure the customer satisfaction of individuals who had used the Technical Assistance component in interacting with the Natural Resources Conservation Service (NRCS). The survey was conducted by telephone in April 2001. USDA sampled 2,500 CTA recipients from 2000-2001, from whom 260 interviews were completed. Customers were polled on their experience with CTA through questions addressing convenience, effectiveness, clarity, ratings of personnel, expectations versus service received, and user interface.




The Survey of State and Local Government Emergency Officials will recognize the importance of developing strategies to effectively manage data collection and analysis. The survey response collection mechanisms will be set up to ensure accuracy of the data input process and to ensure that survey responses are attached to relevant demographic information to aid in the analysis process.
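A minimal sketch of what attaching demographic information to survey responses can mean in practice is shown below; the respondent IDs, field names, and records are all invented for illustration.

```python
# Hypothetical respondent roster keyed by an assumed respondent ID.
demographics = {
    "R-001": {"jurisdiction": "county", "population_band": "under 100,000"},
    "R-002": {"jurisdiction": "state", "population_band": "statewide"},
}

# Hypothetical raw responses as they might arrive from a collection tool.
responses = [
    {"respondent_id": "R-001", "question": "Q1", "answer": "Satisfied"},
    {"respondent_id": "R-002", "question": "Q1", "answer": "Neutral"},
]

def attach_demographics(responses, demographics):
    """Join each response to its respondent's demographic record so results
    can later be segmented by jurisdiction, population band, and so on."""
    return [{**r, **demographics.get(r["respondent_id"], {})} for r in responses]

for row in attach_demographics(responses, demographics):
    print(row)
```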


From these and other survey reviews, DHS has created the following summary table of best practice methodologies.

Figure 4.4 Other Government Survey Best Practice Design and Methodologies

[Matrix figure: for each survey listed, the original table marks which of the design elements below it employs.]

Design elements compared: Addresses Effectiveness; Addresses User-Friendliness; Random Sample; Stratifies Sample; Programmatic Functions; Neutral Questions; Uses Likert Scale; Includes Recommendations.

Surveys reviewed: U.K. Office of the Deputy Prime Minister - Planning Inspectorate; Consumer Product Safety Commission; Department of Commerce - Bureau of Economic Analysis; State of Delaware Office of Management and Budget Government Support Services; Department of Defense - Defense Technical Information Center; Department of Agriculture Conservation Technical Assistance; Social Security Administration Customer Satisfaction Survey.


United States Government Accountability Office (GAO) Reports

The GAO has produced several reports that include aspects of DHS interaction with State and local government emergency officials. Researchers cataloged and analyzed these reports for takeaways to contribute to the survey design, particularly for guidance on issues that the Survey of State and Local Government Emergency Officials should address.


Overall, the GAO reports demonstrate the depth and breadth of homeland security issues, from transportation and border security to protective services and critical infrastructure protection. GAO reports offer insight into the essential features of each program area, guiding survey design to ensure that questions are relevant and well-designed to assess strengths and weaknesses in each area.


This section includes a table of all relevant GAO reports used to design the survey questions. Several specific examples are included below the table to give the reader an idea of the types of lessons learned from the review. A complete discussion of all pertinent GAO reports is included in Appendix D.

Figure 4.5 Applicable GAO Reports

Nuclear Power: Plants Have Upgraded Security, but the Nuclear Regulatory Commission Needs to Improve Its Process for Revising the Design Basis Threat
Objective: To address the security status at nuclear power plants and the Nuclear Regulatory Commission’s efforts to strengthen the conduct of its force-on-force inspections.

Border Security: Continued Weaknesses in Screening Entrants into the United States
Objective: To address border security issues, specifically the capabilities of Customs and Border Protection (CBP) in detecting counterfeit identification presented by those entering the United States.

Homeland Security: Overview of Department of Homeland Security Management Challenges
Objective: To update GAO’s analysis of the DHS transition as a high-risk area, especially in establishing partnerships with State and local government emergency officials, conducting nationwide strategic planning, and focusing management efforts.

Transportation Security: Systematic Planning Needed to Optimize Resources
Objective: To assess the efforts of TSA and DHS in reducing transportation risks while preserving efficiency across all transportation modes.

Homeland Security: Management of First Responder Grant Programs Has Improved, but Challenges Remain
Objective: To investigate the ability of the Office for Domestic Preparedness (ODP) to effectively manage its growing domestic preparedness grant programs.

Homeland Security: Managing First Responder Grants to Enhance Emergency Preparedness in the National Capital Region
Objective: To provide recommendations for the NCR, which was described as lacking vital components to enable first responders from its component jurisdictions to collaborate in effective all-hazards prevention, preparation, response, and recovery.

Homeland Security: Actions Needed to Better Protect National Icons and Federal Office Buildings from Terrorism
Objective: To provide recommendations for avoiding conflicting responsibilities between the Department of the Interior (DOI) and the General Services Administration (GSA) in protecting national icons, monuments, and buildings from terrorism.

Homeland Security: Management of First Responder Grant Programs and Efforts to Improve Accountability Continue to Evolve
Objective: To provide the history of the first responder grant programs and identify the need to efficiently distribute and use the grants based on comprehensive preparedness planning.

Homeland Security: Key Cargo Security Programs Can Be Improved
Objective: To recommend management strategies to close security loopholes identified in CBP programs.

Homeland Security: Agency Plans, Implementation, and Challenges Regarding the National Strategy for Homeland Security
Objective: To review the implementation of the National Strategy for Homeland Security in terms of security initiatives, agency structure, and security challenges since September 11, 2001.

Homeland Security: Effective Regional Coordination Can Enhance Emergency Preparedness
Objective: To identify the factors that lead to effective regional homeland security coordination and establish how such coordination can be further promoted.

Homeland Security: Transformation Strategy Needed to Address Challenges Facing the Federal Protective Service
Objective: To address a new set of challenges for FPS in its transition from GSA to DHS.

Homeland Security: Risk Communication Principles May Assist in Refinement of the Homeland Security Advisory System
Objective: To summarize the operations of HSAS and examine the literature on risk communication to consider when, what, and how information should be disseminated.

Homeland Security: Effective Intergovernmental Coordination Is Key
Objective: To provide an integrated approach to protecting against threats and coordinating resources and authority.


Homeland Security: Effective Regional Coordination Can Enhance Emergency Preparedness (GAO-04-1009)

This report was chartered to determine the factors that lead to effective regional homeland security coordination, and to establish how such coordination could be further promoted. In addition to the several local factors that foster and encourage coordination, the Federal government in general—and DHS in particular—can encourage regional coordination by requiring it as part of its grant process. This is especially true if grant programs allow the regions the flexibility to organize themselves in accordance with their needs.



Homeland Security: Agency Plans, Implementation, and Challenges Regarding the National Strategy for Homeland Security (GAO-05-33)

GAO reviewed the implementation of the National Strategy for Homeland Security to determine whether the planning and implementation activities of lead agencies address security initiatives, whether the structure of agencies contributes to implementation, and to identify homeland security challenges since September 11, 2001. The National Strategy for Homeland Security—a plan to improve homeland security through the cooperation of Federal, State, local, and private sector organizations—organizes a critical array of functions into six distinct “critical mission areas”: intelligence and warning; border and transportation security; domestic counterterrorism; protecting critical infrastructure and key assets; defending against catastrophic threats; and emergency preparedness and response.


The strategy identifies “major initiatives” to be addressed within each mission area and some that cut across mission areas, 43 in all. Although a lead agency was identified for implementation in almost all cases, many initiatives also had multiple lead agencies, and more than three-quarters of the initiatives were implemented by three of the six departments reviewed. GAO noted that an improved risk management framework was required to guide further investment, as well as an improved set of performance measures to gauge progress and results. The overarching National Strategy and Homeland Security Presidential Directives did not, in many cases, define the roles of the State, local, and private sectors. A major challenge for the National Strategy for Homeland Security will be the coordination of these agencies, with congressional oversight to ensure implementation.



Homeland Security: Management of First Responder Grant Programs Has Improved, but Challenges Remain (GAO-05-121)

GAO investigated ODP’s ability to effectively manage its increasing domestic preparedness grant programs. In this investigation, GAO addressed how grants for States and urban areas were administered in 2002 and 2003 so that ODP could ensure that the funds were spent in accordance with grant guidance on State preparedness planning.


GAO found that ODP has established grant award procedures for States and localities to improve accountability in State preparedness planning, and that Congress and ODP have made efforts to expedite grant awards by setting time limits for the grant application, award, and distribution processes. However, the ability of States and localities to spend grant funds expeditiously is hampered by various legal and procurement requirements.




Legal and procurement requirements could potentially be addressed in the user-friendliness content questions within the survey that are directed at grant management. Because the proposed survey is national in scope, DHS could use the information in the GAO reports to focus the survey questions to produce a detailed picture of these problems and design corrective action programs.
SECTION 5: DHS EXISTING DATA ANALYSIS


DHS strives to interact effectively with State and local government emergency officials. As such, different elements within DHS have established formalized measurements to monitor and improve relations with State and local officials. Researchers identified over 250 DHS entities that have pertinent interaction with State and local officials. This section highlights examples of efforts within DHS to evaluate and improve that interaction. The data in this section serve as a representative example of DHS’s consistent effort to assess and improve its interaction with State and local officials.


The case studies were conducted in order to capture current DHS best practice efforts to evaluate interaction with State and local officials so that these methods can be reflected in the survey. The detailed case study analysis for each of the examples listed in Figure 5.1 can be found in Appendix E. The results of these DHS efforts will also be leveraged in the Phase II Report recommendations.

Figure 5.1 Examples of DHS Existing Data that Address Effective and User-Friendly Interaction with State and Local Officials

Training Division Evaluator Database
Objective: To evaluate the quality of instructors and improvements in skill among participants.
Type of data: Course evaluations.
Takeaways: Methods of assessing the effectiveness of DHS-sponsored training programs.

Homeland Security Grant Program (HSGP) After-Action Conference
Objective: To improve the effectiveness and user-friendliness of the HSGP.
Type of data: Grant recipient commentary.
Takeaways: General user-friendliness and effectiveness recommendations relating to grant guidance.

State and Urban Area Feedback from the Nationwide Plan Review (NPR)
Objective: To increase the user-friendliness of DHS interaction with State and local jurisdictions while working to assess capabilities.
Type of data: Survey respondent commentary.
Takeaways: Methods of increasing State and local satisfaction with DHS evaluations.

Mobile Implementation Training Teams (MITT)
Objective: To conduct meetings and collect feedback from States, territories, and the District of Columbia on National Preparedness Goal (NPG) implementation efforts.
Type of data: Narrative summaries of issues raised at meetings and qualitative analysis of cross-cutting issues.
Takeaways: Possible strategies for broad survey distribution and adequate response rates; user feedback on DHS programs used to design survey content questions.

Recommendations from the HSGP National Asset Data Base (NADB)
Objective: To effectively compile data from State and local officials on the critical infrastructure in their jurisdictions.
Type of data: Critical infrastructure descriptions.
Takeaways: Methods of increasing State and local feedback, information sharing, and resources.

Rural Crime and Justice Center (RCJC) Comprehensive Training Evaluation Report
Objective: To provide a compilation of law enforcement training program performance for the Nationwide Rural Area Law Enforcement Study (NRALES).
Type of data: Evaluations of training programs.
Takeaways: Three- and six-month follow-up demonstrates the impact and usefulness of training. This report was used to devise survey questions on demographics relevant to rural areas, including emergency preparedness officials and programs, and knowledge and expectations of programs.

Recommendations from State Homeland Security Assessment and Strategy (SHSAS) Technical Assistance 2004 Conference After-Action Reports (AARs)
Objective: To increase the effectiveness and user-friendliness of DHS technical assistance programs directed at State and local jurisdictions.
Type of data: AARs.
Takeaways: Methods of improving DHS technical assistance programs by tailoring them to State and local needs.

Recommendations from the SHSAS Program, Data Collection Tool (DCT)
Objective: To examine threats, vulnerabilities, capabilities, and needs of States and local jurisdictions relating to preparedness.
Type of data: Survey responses.
Takeaways: Methods of improving DHS effectiveness in nationwide homeland security planning; improving the user-friendliness of DHS data collection.

National Emergency Management Baseline Capability Assessment Program (NEMB-CAP) Progress Report
Objective: To assess State-level capabilities according to levels of compliance with Emergency Management Accreditation Program (EMAP) standards.
Type of data: Quantitative rates of compliance with 54 standards.
Takeaways: The need for simplicity, clarity, and objectivity of questions; possible strategies for broad survey distribution.

Transportation Security Administration (TSA)/Office of Intelligence Customer Satisfaction Survey
Objective: To evaluate the effectiveness and user-friendliness of the TSA’s “Weekly Field Intelligence Summary” sent to State and local transportation officials.
Type of data: Survey responses.
Takeaways: Methods of measuring the user-friendliness and effectiveness of intelligence-sharing programs.

Examples of Applicable Office of Management and Budget (OMB) Measures
Objective: To improve the effectiveness of DHS programs interacting with State and local officials.
Type of data: OMB effectiveness measures.
Takeaways: Methods of measuring effectiveness and improvement in DHS interaction with State and local officials.

This page intentionally left blank.

APPENDIX A: HOMELAND SECURITY SURVEYS


Appendix A contains a summary of previous surveys and studies designed to measure the effectiveness of various homeland security efforts within the United States. Taken from a number of organizations, mostly national associations of government representatives dealing with homeland security (governors, mayors, State Chief Information Officers, etc.), the survey compilation enumerates many of the programs and priorities involved in homeland security. These surveys were analyzed for best practices in design and methodology in order to provide background and context for the current Survey of State and Local Government Emergency Officials.

U.S. Conference of Mayors Homeland Security Monitoring Center (2006)

The U.S. Conference of Mayors commissioned a survey to evaluate cities’ and towns’ approaches to preparedness; assess strengths and weaknesses of the current system; and gauge whether differences exist in preparedness policies between cities of different sizes. Analysis of the survey methodology suggests that stratification of the survey sample should reflect the available avenues for future policy action, i.e., if policies can be developed and implemented at several jurisdictional levels, each level should be adequately represented in the survey design to ensure appropriate levels of intervention.


The survey, entitled Five Years Post 9/11, One Year Post Katrina: The State of America’s Readiness, suggests that categorizing data into subgroups based on capabilities (in this case, tied to the available tax base of the population) may be a good way to compare results. This method is applicable to the functional categories within the Survey of State and Local Government Emergency Officials.


The survey was distributed through the U.S. Conference of Mayors organization, reaching cities and towns across the country. In total, 183 cities and towns from 38 U.S. States, the District of Columbia, and Puerto Rico responded to the survey. The survey included ten questions regarding the timeframe for achieving interoperable communications; improving plans and overall preparedness; and coordination with the Federal government. Specifically, respondents were asked whether they had received sufficient funds for communications interoperability among their first responders; whether they had developed mutual aid agreements with nearby military facilities and evacuation plans as a result of Hurricane Katrina; and to indicate their confidence in and the extent of their interaction with the Federal government and its emergency management and domestic preparedness agencies. Responses were analyzed in the aggregate as well as by population, segmented as follows: less than 100,000; between 100,001 and 300,000; and 300,001 or more.
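A short sketch of this population segmentation is given below; the city names and populations are invented, and because the report’s stated cut points leave a population of exactly 100,000 unassigned, the sketch places it in the middle band as an assumption.

```python
from collections import defaultdict

def population_segment(population: int) -> str:
    """Map a city to the Mayors' survey population bands; exactly 100,000
    is assigned to the middle band here as an assumption."""
    if population < 100_000:
        return "under 100,000"
    if population <= 300_000:
        return "100,001 to 300,000"
    return "300,001 or more"

# Invented example cities for illustration only.
cities = {"Alpha": 45_000, "Beta": 150_000, "Gamma": 900_000}
by_segment = defaultdict(list)
for name, pop in cities.items():
    by_segment[population_segment(pop)].append(name)
print(dict(by_segment))
```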


The results of the survey reflected progress in overall preparedness since September 11, 2001, with 92% of respondents indicating a five or better on a 1-10 scale of improvement. However, 80% of respondents indicated that they have not received sufficient Federal resources to achieve full communications interoperability, with an average timetable for achieving interoperability of four years. While cities were, on average, confident in their ability to survive on their own in the 72 hours following a disaster, the outlook was different for a pandemic flu outbreak: 70% of cities indicated that they would not be prepared to handle a pandemic flu outbreak on their own (the question’s timeframe was “the first days and weeks”).

National Association of State Chief Information Officers (NASCIO): Information Security Committee (2006)

The Information Security Committee of the National Association of State Chief Information Officers (NASCIO) conducted a survey to identify the condition of State cyber security initiatives and assess the nature of their relationship with DHS cyber security programs. NASCIO’s analytical methodology produces recommendations at both a strategic level as well as a more functional, tactical level, paving the way for policy analysis and actionable alternatives for Federal, State, and local officials.


Although the actual survey questions were not listed in the report, the analysis by NASCIO provided examples of analytical methodology. For instance, NASCIO researchers made an effort to identify the objectives of the security programs they surveyed, rather than simply critique them. The NASCIO report also included a set of specific recommendations for how to address many of the issues uncovered by their survey.


Chief Information Officers (CIOs) and/or State-government-wide Chief Information Security Officers (CISOs) from the 50 States and the District of Columbia were questioned on issues such as the protection of critical infrastructure and assets, budget information, relationship with DHS, and staffing for information security offices. The survey was conducted jointly with the Metropolitan Information Exchange (MIX), a national association of county and municipal CIOs. Ultimately, 27 States responded, representing 57% of the total possible recipient pool.


Quantitative findings were included in an appendix to the report. Statistics include the finding that 20 State CIOs (77% of respondents) reported possessing “actionable” information for dealing with “external automated” attacks. The majority of State CISOs (73%) reported having conducted a risk assessment for information and communications technology systems that are “homeland security mission critical” assets. The study concludes that there is poor communication between DHS and State and local jurisdictions, largely because of the lack of a unique role for DHS in cyber security.


Although CISOs would welcome a closer relationship with DHS, the survey indicated that emphasis should also be placed on protection against threats that could be mitigated by groups such as U.S. Computer Emergency Readiness Teams, the Secret Service, the FBI Cybercrime Division, and the private sector. The findings also note that better knowledge of local academic programs by CIOs would inform efforts to produce competent workers and practical research in information security.

Western Carolina University Institute for the Economy and the Future National Survey of State Homeland Security Officials (2006)

Western Carolina University’s Institute for the Economy and the Future conducted a survey to gauge the opinions of State homeland security directors and officials as to the quality, effectiveness, and efficiency of State and Federal homeland security programs. The survey demonstrates the need to probe multiple jurisdictional levels to develop a complete picture that reflects the complexity of the opinions captured by the survey.


The survey indicated a general opinion among State-level homeland security officials that much work needed to be done at the Federal level. Although these officials expressed confidence in their own plans, disagreement between different State officials suggested that some of their frustration with the Federal level may be the result of a Federal homeland security structure trying to accommodate different interpretations of homeland security responsibilities on the State and local level. More research can be done to identify differences in opinion at the State level about priorities and responsibilities. These differences affect the policies of the Federal homeland security system.


The survey was sent to the homeland security advisor and the head of the emergency management division in each State. In some States, both positions were held by the same official; in others, a different official filled each role. Where both officials responded, the response of the State homeland security advisor was given priority and the response of the head of the emergency management division was discarded. The survey was conducted by telephone and Internet, and the process elicited responses from 34 States.


Respondents were asked about various homeland security policies, priorities, and programs—both at the State level and at the Federal level. For instance, respondents were asked to evaluate both the effectiveness of Federal grant funding programs and to explain how the funds they received had been allocated. They were also asked about the responsibilities for various homeland security priorities such as prevention and protection activities and intelligence sharing.


The report found that, although 65% of respondents did not believe DHS’s mission had been clearly outlined, 62% believed their own office’s mission had been adequately defined. A majority indicated they believed prevention should be the primary responsibility of the Federal homeland security system, but were divided (43% to 38%) about whether preparation or prevention should be the main goal of State homeland security offices. Over 90% of State officials recognized that rural areas could assist urban areas during national emergencies. A majority of respondents (82%) said they received insufficient funding assistance from the Federal government, and over 26% reported that the funds they had received had not made them better prepared for dealing with terrorism.

National Governors Association (NGA) Center for Best Practices “2006 State Homeland Security Directors Survey: New Challenges, Changing Relationships” (2006)

NGA designed this survey to gauge the priorities, policies, and governance structure of State homeland security agencies, including the relationship of each State’s homeland security office to its State government, the opinions of those officials about the Federal programs, and the coordination levels with the private sector and other State agencies. The survey indicated a disagreement between State governments over what their role should be in a national homeland security system, highlighting the importance of designing an analysis strategy that captures different opinions of the relevant subgroups within a sample population.


All 55 State homeland security directors (including those of U.S. territories) received the survey. A majority of directors completed and returned the survey between December 2005 and January 2006, generating a 73% response rate. Respondents were asked for their top policy priorities, the State official to whom they directly reported, and their opinions on the priorities of the Federal homeland security system. They were also asked about their use and opinion of Federal intelligence, their coordination with local responders, and their relationships with the private sector, the National Guard, and other State governments.


The survey found that States still listed interoperability, intelligence, and coordination with local agencies as top priorities, now joined by pandemic influenza and natural disasters.

Survey Conducted by U.S. Senator Ken Salazar (2005)

The Colorado State survey gauged respondents’ opinions on the homeland security grant application process, the timely and accurate distribution of intelligence, and the regional emergency preparedness coordination efforts of DHS. The survey highlighted differences in the responses of urban versus rural State and local officials, and demonstrated that demographic information can be included as a basis for analysis in addition to jurisdictional and functional stratification of the sample.


The survey was divided into three sections: counterterrorism intelligence, grants, and regional emergency response. Question phrasing was direct, as evident in the example below from the survey:


Counterterrorism Intelligence. State and local officials report that they do not get timely, accurate and actionable intelligence from Federal sources. There are currently 15 Federal antiterrorism agencies and at least 12 Federal terrorism watch lists. Local officials often get conflicting data, they get data that has no impact on them, and they are unable to process intelligence to form a complete picture of the threats they face, and what steps they can take. Which terrorism watch lists do you have access to and which do you use on a regular basis? What are the challenges to making these lists useful to you? What are your recommendations for improving these lists?5


Senator Salazar’s survey included responses from more than 60 officials, both at the State and local levels, and included homeland security planners as well as first responders. The survey underscored the State of Colorado’s successes in developing regional partnerships, but also highlighted challenges in areas such as intelligence sharing and interaction with the Federal government on developing policy priorities. Overall, the survey found that “[d]espite intense efforts to improve local security, Colorado’s first responders and emergency officials feel largely unprepared for a major terrorist incident and frustrated by inconsistent direction from the federal government.”6


The survey also suggested that significant disagreements exist at the State and local levels about what DHS’s funding priorities should be. For instance, rural areas surveyed by Senator Salazar indicated that they believed too much grant money goes to large cities, ignoring the citizens and the critical infrastructure in rural areas. Conversely, some urban areas indicated their belief that rural areas have too much influence in setting priorities during regional coordination meetings.


On July 13, 2005, after reporting the results of his Colorado survey, Senator Salazar amended H.R. 2360 to direct DHS to conduct the survey contained in this report. His objective was “to ensure that our significant investments in homeland security are going to the right priorities and that local officials are getting better direction to guide their efforts.”7

U.S. Conference of Mayors Homeland Security Monitoring Center (2004)

The U.S. Conference of Mayors established a Homeland Security Monitoring Center to closely monitor Federal plans for the distribution of homeland security funds through the States. In January 2004, the organization conducted a survey to track homeland security funds sent to the 50 State governments and local jurisdictions. The First Mayors’ Report to the Nation: Tracking Federal Homeland Security Funds Sent to the 50 State Governments was a study of the flow of Federal homeland security funding to cities through the States.


Mayors and leaders of city governments were contacted during the U.S. Conference of Mayors convention in December 2003 and asked to participate in the survey. Ultimately, 215 municipalities responded, including cities and towns in every U.S. State and Puerto Rico. The survey addressed Federal first responder and critical infrastructure programs, State domestic preparedness funding, integration with State planning processes, and other aspects of State preparedness efforts.


Results of the survey were compared to those of a similar 2003 survey to provide an overview of preparedness measures taken by cities and local jurisdictions; the comparison also examined grant distribution efficiency relative to population. Funds for preparedness measures appeared to flow mainly to equipment purchases, with less spent on training, exercises, and planning. Municipal leaders reported difficulty interfacing with their respective States for both appropriation and planning purposes.


The results of the survey indicated that high percentages of intended recipients had yet to receive any grant money. For instance, 76% of respondents had received none of the $1.5 billion allocated for the Federal First Responder/Critical Infrastructure grant program. Many jurisdictions that had received money felt that the funds would not help solve their most important security priorities. Furthermore, jurisdictions complained that they had not been given an opportunity to influence how their funds would be allocated or used. Overall, the survey indicated uneven progress in distributing grant money: some funds, such as those for training, were being distributed fairly broadly, whereas others were taking time to filter down to the municipal level.

National Governors Association (NGA) Center for Best Practices “Homeland Security in the States: Much Progress, More Work” (2004)

The National Governors Association Center for Best Practices conducted a survey to gauge State homeland security policy, governance, preparedness, coordination, communication, and intelligence sharing. The survey demonstrates that interoperable communications remain a priority and that DHS grants may need to focus more on response measures of preparedness, providing insights into program areas that will be incorporated into content questions in the upcoming survey.


The survey was distributed by NGA to 55 State and territorial homeland security directors, and was returned by 38 of them—a 69% response rate. It included information on States’ future homeland security priorities. The ranked list of top ten future priorities was as follows:


  1. Develop interoperable communications for emergency responders

  2. Develop a State intelligence fusion center

  3. Identify and protect critical infrastructure

  4. Coordinate efforts of State and local agencies

  5. Improve procedures to receive timely intelligence information

  6. Use exercises and simulations to improve preparedness

  7. Obtain funding

  8. Secure seaports, airports, and borders

  9. Organize State resources for homeland security

  10. Integrate incident command systems


The list indicates that communications interoperability was still an ongoing problem. It also indicated that in spite of State complaints of lack of funding, obtaining more funding was a low priority overall, ranked seventh behind conducting more exercises and improving State and local coordination.


Respondents were asked several questions in each of the survey’s issue areas. For instance, respondents were asked where they fit into their State emergency management and homeland security structure—whether they are in charge of their own department, what that department is tasked with, and to whom they report. Respondents were asked if they had designed exercises to test preparedness, if they had established Continuity of Government (COG) plans, and if they had appropriately passed or amended existing legislation to allow for appropriate quarantine measures. Respondents were also asked about coordination of emergency plans and about the development of emergency communications systems. Finally, respondents were asked to describe their methods of collecting, analyzing, and disseminating statewide intelligence.


The researchers grouped their findings into the five issue areas they originally outlined as the purpose of the report and were able to draw several conclusions from this analysis. The survey’s findings reflected substantial developments in all five issue areas, but also indicated significant hurdles that had yet to be overcome. For instance, all respondents had established statewide operations centers; nearly all (98%) had designed exercises to test agency response plans; 95% had done significant bioterrorism preparedness work; and 94% had made mutual assistance agreements with neighboring States. However, 70% of respondents indicated a desire for more Federal funding, and 33% were unsatisfied with current grant guidance. Furthermore, 55% of respondents believed current grant funds unnecessarily underemphasized prevention in favor of response activities, and 73% of States had yet to achieve statewide communications interoperability.


David B. Cohen, Ph.D., et al. “Effective Preparation or Politics as Usual? The Impact of Institutional Arrangements and Perceived Threats on Homeland Security Policymaking at the State and Local Level” (2004)

Dr. Cohen and colleagues at the University of Akron’s Ray C. Bliss Institute of Applied Politics conducted a survey to determine whether a strong Federal homeland security system reduces the incentives of State and local governments to work toward their own preparedness. The survey demonstrates the correlation between perceived threat and the intensity of local preparedness efforts, highlighting the need to survey high-risk localities as well as those less likely to be targeted in order to maintain a representative sample.


The researchers created their own measure of government centralization in the States, based on their own studies and on prior research. They compared their conclusions on government centralization with several ten- and seven-point scale results from the survey respondents to see if there was any correlation between the extent of government centralization and how homeland security programs operated at the local level. They also differentiated the results based on city population (three tiers) and on who responded (surveys were sent in groups of three to each State, and depending on the organizational structure, different officials responded).
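To show the kind of comparison described (not the Akron study’s actual index or data, which are not reproduced here), the sketch below correlates an invented centralization index with invented scale scores.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Fabricated data: a centralization index per State vs. a 10-point scale item.
centralization = [0.2, 0.5, 0.7, 0.9, 0.4]
scale_scores = [6, 5, 7, 4, 6]
print(f"r = {pearson(centralization, scale_scores):.2f}")
```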


The initial survey was mailed to 588 potential respondents. At the time of the report, researchers had received nearly 200 replies—a preliminary response rate of approximately 34%. Respondents were asked to rate the threat to their communities across the conventional and unconventional weapons spectrum, if and how budget constraints have impacted the homeland security policies, the effectiveness of their jurisdiction in soliciting funds, the priority of homeland security programs in their cities, and the extent of stakeholder involvement.


The findings of the survey demonstrated that, as perceived threats increase, local officials will indeed commit more resources to homeland security programs. The findings contradicted earlier research suggesting that, because localities believed the Federal government would intervene in large-scale disasters, local governments had few incentives to spend scarce resources on homeland security programs. Second, the findings did not demonstrate that stronger centralized State government inhibited homeland security programs at the local level.


National Association of Counties (NACo) Homeland Security Funding: The Urban Area Security Initiative (2004)

NACo conducted a survey to track the spending of UASI funds distributed to counties across the country. The survey demonstrates the value of open-ended “how” or “why” questions in soliciting recommendations, and shows that telephone follow-up can increase survey response rates.


The primary mission of NACo is to ensure that the county government message is heard in Washington, D.C. Its goals are to serve as a liaison with other levels of government, improve public understanding of counties, and advocate and problem-solve with and for counties. In February 2004, NACo produced a survey designed to explore homeland security funding—specifically the Urban Area Security Initiative (UASI). This FY03 initiative by the DHS Office for Domestic Preparedness (ODP) was designed to combat terrorism in the United States by targeting Federal funding to the 30 high-threat urban areas in 20 States.


The example of NACo’s UASI funding report demonstrates that comment boxes tied to specific questions are good indicators of improvements that can be made to facilitate State and local cooperation. To be most helpful, these suggestions should respond to a “how” or “why” question. Qualitative answers also require extensive study, so their use should depend on the scope of the survey. In this case, the sampling of “core counties,” or targeted areas within high-threat urban areas, was very effective in demonstrating a need for responsiveness, efficiency, and improvement.


Each county was emailed a letter containing a link to the online survey instrument. This correspondence was followed a few days later by a telephone call from a NACo staff member who worked with local officials to collect the responses. Fifteen core counties, or 50%, completed the survey. The responses represented 12 of the 20 States with designated high-threat areas, or 60%, and included the counties of San Francisco, San Diego, Los Angeles, Miami-Dade, Dallas, and Washington, DC, as well as the counties containing Newark, NJ; Memphis, TN; Portland, OR; and Seattle, WA. The responses were tallied into percentages by question, covering topics such as State-local communication on UASI grants, appropriation of funds toward preparedness and homeland security goals, and the level of cooperation between cities and counties. The survey included nine multiple-choice questions and three fill-in-the-blank sections (one for appropriation data and two comment boxes).


The survey found that 100% of core counties had been well informed about their States’ process for submitting a UASI eligibility plan to ODP. The overwhelming majority had also participated in discussions with their States about the distribution of these funds, and 100% had discussed funding with other participating local governments in their high-threat area. In terms of preparedness expenditures, the largest share of the funds (80%) was requested for equipment.

Figure A.1 UASI Funding


When asked whether they had received any of their UASI funds as of the date of their response, 47% of core counties responded yes and 53% responded no. UASI funding amounts ranged from a low of $40,000 up to $18.5 million (San Francisco County). When asked what percentage of the anticipated funds they had received, 81% reported receiving from 0 to 25%. States had appropriated homeland security funds to counties in only 47% of cases; one-third of core counties did not know whether their States had appropriated these funds. A majority (73%) of core counties, however, reported using their own general operating funds to enhance homeland security efforts.

National Emergency Management Association (NEMA): State Spending of Homeland Security Funds (2003)

The National Emergency Management Association conducted a survey to determine how Federal homeland security funding was administered at the State level. The survey demonstrates the value of Likert scales and the impact of governance structure on the way a jurisdiction perceives its homeland security role.


The survey was sent without advance notice to the homeland security directors of all States and U.S. territories, and elicited a 71% response rate. State administrators were asked how much money they had received from various grant programs, how much of that money had been spent, and on what it had been spent. Furthermore, administrators were asked why money had remained unspent in some cases and what had contributed to the delays.


The survey emphasized government structure as having an impact on homeland security programs at the local level. Given the emphasis of the present survey on perceptions of user-friendliness and effectiveness at the State and local levels, measuring government structure may serve as a useful comparison. Furthermore, this survey used larger point scales (up to 10 points) rather than the five-point scales that had been used in other surveys.


The findings indicated that although 24% of the funds passed to States between FY00 and FY02 remained unallocated and unspent, 83% of the unspent funding consisted of FY02 funds that had been received only within a few months of the survey. Respondents indicated that the delays in distributing grants were not inappropriate, pointing out that the grant programs of the last two fiscal years had been administered on a very tight schedule.


Respondents indicated a desire for a more flexible funding schedule, as well as more flexibility in determining what grant money could be used for. Moreover, respondents indicated a desire to receive grant money for reimbursement of emergency security changes conducted in the months after September 11, 2001, when few Federal funds were available.


The results also indicated problems and delays with receiving the funds on the local level in some cases. For instance, local city councils sometimes had to approve the receipt of Federal funds, and many had not yet done so, further delaying the distribution of grant monies. Furthermore, the survey indicated that the States have, on average, matched or surpassed pass-through requirements each year since FY00. States also remarked that congressional legislation earmarked the funds for equipment, training, and exercises but not for expanding staff, even though State and local responders are undermanned.


Figure A.2 Status of FY00-FY02 Homeland Security Funds

APPENDIX B: SENATOR SALAZAR SURVEY


MEMORANDUM


To: Colorado Emergency Officials and First Responders

From: U.S. Senator Ken Salazar

Re: Homeland Security Survey: Improving Government Coordination

Date: April 4, 2005



One of my top priorities in Washington is making sure State and local homeland security officials like you have the resources, information and infrastructure you need to do your jobs.


State and local emergency officials represent more than 95% of America’s counterterrorism capability. You are on the front lines of the war on terror. Despite this, you are not getting critical help you need in several key areas.


Congress and the administration have done a lot to improve the situation in recent years, but many challenges remain. Specifically, local officials do not currently have access to:


  • Timely, Accurate and Actionable Counterterrorism Intelligence.

  • A Transparent and Reliable Grants Application Process.

  • Effective Regional Emergency Response Procedures.


I am writing to ask for your assistance and feedback. Too often, lawmakers in Washington develop Federal policy without taking advantage of the expertise and knowledge of people on the ground.


I hope you will take the time to complete the attached survey and share your own experiences with me. Your insight and recommendations will be critical to me as I work to develop solutions to these problems.


I ask you to please return this survey to my office by May 31 to allow me to begin implementing your suggestions this legislative year.


You can submit this form by fax to 202-228-5036 or by mail at 702 Hart Senate Office Building, Washington, DC 20510. Alternately you can fill out the form online at http://salazar.senate.gov/hsq.cfm


Over the next six years, I want to be your partner in Washington. Together, I think we can significantly improve homeland security coordination between the Federal government and State and local officials.



Homeland Security Questionnaire

U.S. Senator Ken Salazar



Name: _____________________________________


Title: _____________________________________


Department: _____________________________________


Address: _____________________________________


_____________________________________


_____________________________________


Phone: _____________________________________


E-Mail: _____________________________________




1. Counterterrorism Intelligence. State and local officials report that they do not get timely, accurate and actionable intelligence from Federal sources. There are currently 15 Federal antiterrorism agencies and at least 12 Federal terrorism watch lists. Local officials often get conflicting data, they get data that has no impact on them, and they are unable to process intelligence to form a complete picture of the threats they face, and what steps they can take.


Which terrorism watch lists do you have access to and which do you use on a regular basis? What are the challenges to making these lists useful to you? What are your recommendations for improving these lists?









What other sources of Federal counterterrorism information (including FBI field offices) do you have access to and which do you use on a regular basis? What has been your experience with these sources? What are your recommendations for improving the flow of information from the Federal government to your office?









Would having more security clearance for classified and top secret antiterrorism documents help you? Which would you prefer, more top-secret clearance or more actionable unclassified intelligence?






What is your biggest challenge to using Federal counterterrorism information? What resources or staff would be most useful to solving this problem?









What are your main points of contact for sharing terrorism information you collect yourself? Do you feel that Federal officials value the tips and information you share? What are the challenges you face in this regard and your suggestions for improving the situation?









Other Comments on Intelligence Sharing?











2. Grants. The Department of Homeland Security has had a number of problems getting homeland security funds to the local departments and projects that need them the most. In the past, the department has allocated billions of dollars without sufficiently focusing on the most vulnerable targets. In addition, the grant application process was cumbersome and divided among six different programs. The Department of Homeland Security made major steps to streamline the grant application process and is trying to better target grants. However these reforms are being made without sufficient input from local and State government officials.



Where do you find out about Federal grant opportunities? Do you feel these information sources are adequate and what would you do to improve them?








Do you have the resources and expertise to write effective grant applications? Are the applications too long and cumbersome? What would you need in order to bolster your ability to apply for grants?








What are the barriers to getting and using grant money in an effective way?








Do you feel that Federal grants are going toward the right priorities and are being coordinated? Do you have specific examples to illustrate your position?








Are you getting enough support to identify and prioritize threats to critical infrastructure in your jurisdiction? What do you need to improve prioritization of threats?








Other comments on grant applications?









3. Regional Emergency Response. Regional emergency response capability is essential to reacting to and recovering from a major terrorist attack. An estimated 85 % of states and 70 % of localities have joined at least one antiterrorism network. However, each network has its own culture, infrastructure, and equipment. Regional organizations are often not properly funded and have little guidance from Federal officials. This situation can result in redundancy, turf battles, and inability to integrate emergency response at the regional level.


What regional partnerships and emergency preparedness arrangements do you participate in?









Do you feel these regional partnerships are sufficient? What are the greatest challenges to your participating in these partnerships and what steps are needed to improve the situation?









What has been your experience with issues of jurisdiction, chain of command and operational control during emergencies that require multi-jurisdictional response? Are there any steps the Federal government could take to make the situation better?








What are the major challenges your department faces in terms of interoperable communications? What equipment and resources do you need to achieve true interoperability?








What are the challenges you face in identifying Weapons of Mass Destruction Attacks? What are the barriers to adequate medical response on a regional basis?









What has been your experience with Federal entities during emergency incidents?

This can include the Federal Emergency Management Agency (FEMA), the Department of Homeland Security’s Office of Emergency Preparedness and Response, the Urban Areas Security Initiative (UASI), the Federal Bureau of Investigation (FBI), the National Guard, the military and other Federal entities.








What kind of guidance have you received from State and Federal sources about how to manage regional coordination?









Other comments on regional emergency coordination?






4. Permission to use information.

We would like to use your responses to inform our legislative process and illustrate the need for reform.


I give Senator Ken Salazar permission to use my responses on this form for legislative, press and other purposes.



Signature Date


APPENDIX C: OTHER GOVERNMENT SURVEYS


As part of the effort to compile homeland security surveys, researchers also collected and summarized other reports dealing with the user-friendliness of Federal and State agencies. Program functions include technical assistance, economic analysis, and information centers. These programs, which are utilized by other government officials and the public sector, provide insight into existing methodologies and analytical frameworks for measuring user-friendliness of government functions.


The surveys offered insight into methods for the general distribution and collection of a survey. Because response rates are crucial to validity, current contact information and follow-up communication are necessary. Additionally, examples of standardized surveys show that results can be compared across departments, and that the same survey, administered over time, can demonstrate improvement. As with the homeland security survey examples, stratification of users and interagency coordination should be tracked, as well as how widely and frequently services are used.

U.K. Office of the Deputy Prime Minister – Planning Inspectorate (PINS) “Customer Satisfaction Survey 2005: Final Report”

The Planning Inspectorate at the U.K. Office of the Deputy Prime Minister conducted a survey to measure levels of satisfaction among groups interacting with PINS and to gather recommendations on how PINS could improve. The survey demonstrates that the Action Indicator tool allows a numerical value to be assigned to priorities for change. It also allowed respondent comments, making opinions not otherwise captured available for analysis.


PINS is tasked "to be the prime source of impartial expertise for resolving disputes about the use of land, natural resources and the environment." Annual surveys measure levels of satisfaction among the businesses, agencies, and groups interacting with PINS, taking special care to ask for recommendations for improvement. The purpose of this report, then, was not only to evaluate whether U.K. residents were satisfied with their interactions with PINS, but also to identify how PINS could address its weaknesses.


In order to gather a sample, the PINS researchers collected a set of names and contacts from the PINS database. Every person to whom the survey was sent had interacted with PINS in some capacity in the previous year, whether in development plans, planning appeals, advertisement appeals, or other initiatives. A total of 4,500 surveys were mailed to people who had been involved in PINS cases between June 2004 and June 2005, and 1,535 responded, a 38% response rate. Respondents were asked to select from a list the kind of interaction they had with PINS, which agency publications they had utilized, and their satisfaction with the quality, speed, and clarity of their dealings with the agency. In addition, respondents were provided with a list of specific improvements to the PINS system, such as increased speed and improved quality, and asked to rank them in terms of priority.


The findings were generally positive: 67% of respondents indicated they were generally satisfied with the PINS process and system. This continues the trend from recent annual PINS surveys, in which customer satisfaction has increased each year. Suggestions for improvement generally focused on the speed of various PINS processes, but because satisfaction levels for speed were not generally low, the prioritization tool did not identify speed as a priority for immediate action.

Consumer Product Safety Commission (CPSC): Website Satisfaction Survey (2005)

The Consumer Product Safety Commission conducted a survey to rate the utility and user-friendliness of the CPSC website. The survey's use of a four-point scale demonstrates the significance of survey response design in capturing the full range of possible responses.

Respondents were asked a series of questions designed to determine their ability to find information and navigate the site, quantify the page loading time they experienced, establish their satisfaction with the site's design and layout, and demonstrate the usefulness of the information the site contained. Furthermore, they were asked to rate their overall satisfaction on a four-point scale and to leave any comments they felt would improve the quality of the site. They were also asked how they heard about the site and how frequently they used it.


Figure C.1 CPSC Exposure

The analysis showed that most respondents based their assessment of the site on one or a few visits; most had visited the site rarely or only once. Overall, though, the survey yielded positive results: 94% of respondents found the website satisfactory overall. Of those who were dissatisfied, nearly all issues resulted from being unable to easily locate information on the webpage. A majority of visitors (63%) learned about the site from another internet source, 23% heard about it from an unspecified "other" source, and 15% heard about it from television ("The Today Show" aired a segment involving the site while the survey was available).

Department of Commerce: Bureau of Economic Analysis (BEA) (2005)

The BEA conducted this survey to better understand the needs of its customers and increase the user-friendliness and responsiveness of its services. This survey demonstrates the value of standardization for interdepartmental comparison, and the value of tracking customer satisfaction over time to identify trends. Similar techniques will be applied to the upcoming survey.


A Customer Satisfaction Survey was mailed to 2,083 users of BEA’s products and services. Customers were polled on their assessment of the quality of products and services, the timeliness and accuracy of estimates, and format and documentation of data. Also, the courtesy, expertise, and responsiveness of staff and user-friendliness of the web interface were assessed.


Of the 2,083 users BEA contacted, only 202 responded (169 by mail, 33 online), setting the response rate at 9.6%. This low response rate lessens the validity of the results. However, the analysis was conducted in a reliable manner, using American Customer Satisfaction Index (ACSI) scores weighted by category to determine overall customer satisfaction levels. Survey findings included increased customer satisfaction in the areas of data accuracy and timeliness of estimates, but reduced satisfaction in the area of responsiveness.
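To make the weighting step concrete, the short Python sketch below computes a category-weighted satisfaction index in the spirit of the ACSI approach described above. The category names, weights, and scores are hypothetical; the BEA report's actual weighting scheme is not reproduced here.

    # A minimal sketch of a category-weighted satisfaction index in the
    # spirit of the ACSI approach described above. Category names, weights,
    # and scores are hypothetical illustrations.

    # category: (mean satisfaction score on a 0-100 scale, category weight)
    categories = {
        "data accuracy":  (82.0, 0.30),
        "timeliness":     (78.0, 0.25),
        "responsiveness": (65.0, 0.20),
        "documentation":  (74.0, 0.25),
    }

    def weighted_index(cats):
        """Weighted mean of category scores, normalized by total weight."""
        total_weight = sum(w for _, w in cats.values())
        return sum(score * w for score, w in cats.values()) / total_weight

    print(f"Overall satisfaction index: {weighted_index(categories):.1f}")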

State of Delaware Office of Management and Budget (OMB): Government Support Services (2005)

The State of Delaware OMB conducted this survey to measure vendor performance on Central Contracts for State of Delaware agencies and to assess the effectiveness and efficiency of contracts in Government Support Services. This survey demonstrates that information on specific sampling, distribution and collection methods is needed to properly evaluate results, and that proper analysis of survey findings creates accountability and furthers measures for improvement. The Final Report for the upcoming survey of State and local officials will include a transparent presentation of survey methodology and analysis techniques.


The Delaware OMB sampled contract users by making the survey available on its website. The survey was also re-sent periodically throughout the contract period and could be returned by e-mail or mail. The survey report did not include information on response rates, findings, or analysis. Its contents included an assessment of the clarity of rules and regulations, timely resolution of problems, competitive pricing, overall performance and knowledge of staff, user interface, and whether customer needs were met.


A noteworthy lesson from the Delaware OMB's Government Support Services survey is that information on specific sampling, distribution, and collection methods is necessary to analyze results properly. An analysis of survey findings creates accountability and furthers potential measures for improvement.

Department of Defense: Defense Technical Information Center (DTIC) (2004)

The DTIC conducted this survey to gauge the level of satisfaction among its users and to identify possible areas for improving its products and services. This survey demonstrates the importance of achieving a high response rate and of targeting survey questions to specific services. The survey also demonstrates the effectiveness of open-ended questions in soliciting recommendations and other data. Both techniques will be adopted and applied to the survey of State and local homeland security officials.


Web-based and e-mail surveys were the primary collection methods used to reach registered DTIC customers, who comprised a total sample of 7,901. One-on-one follow-up telephone calls were used to gather contact information in order to increase the response rate, resulting in a total of 1,317 responses (16.6%).


Respondents were polled on information needs, utilization and quality of the DTIC, satisfaction with personnel, and preferred communication methods. The results were processed by normalizing the data: converting the five-point scale of satisfaction ratings to a common scale and taking the mean of favorable responses. Analytic conclusions were based on overall percentages.
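The normalization step can be illustrated with a similar sketch. The cutoff for a "favorable" rating (4 or higher on the five-point scale) and the sample responses below are assumptions, since the DTIC report's exact conventions are not reproduced here.

    # A minimal sketch of the normalization described above. The "favorable"
    # cutoff and the sample responses are assumptions.

    ratings = [5, 4, 4, 3, 2, 5, 4, 1, 3, 4]  # hypothetical answers to one question

    def percent_favorable(responses, cutoff=4):
        """Share of responses at or above the favorable cutoff, as a percentage."""
        return 100.0 * sum(r >= cutoff for r in responses) / len(responses)

    def to_common_scale(rating):
        """Map a 1-5 rating onto a 0-100 scale: 1 -> 0, 5 -> 100."""
        return (rating - 1) / 4 * 100.0

    mean_scaled = sum(map(to_common_scale, ratings)) / len(ratings)
    print(f"Favorable responses: {percent_favorable(ratings):.0f}%")
    print(f"Mean rating on the 0-100 scale: {mean_scaled:.1f}")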


The majority of survey respondents indicated that DTIC is important in accomplishing business objectives and supporting the overall mission, and said they would recommend DTIC to colleagues. However, the limited calling effort and unresponsive users led to a low response rate, greatly decreasing the validity of the results.


The DTIC survey demonstrates that maintaining current contact information is essential in following up with customers. Customer feedback could also be integrated into the service itself. The questions were structured, providing information on how long the service has been used, whether information is obtained for the respondent or for others, and how widely certain products are used. The survey also included a write-in box for further commentary.

Department of Agriculture (USDA): Conservation Technical Assistance (CTA) (2001)

The USDA conducted this survey to measure the customer satisfaction of individuals who had used the Technical Assistance component in interacting with the Natural Resources Conservation Service (NRCS). This survey demonstrates the usefulness of combining statistical input with cause and effect modeling to produce indices of satisfaction ratings.


The survey was conducted by phone in April 2001. A sample of 2,500 customers who had received CTA within the past year was drawn, from which 260 interviews were conducted. Customers were polled on their experience with CTA through questions on convenience, effectiveness, clarity, ratings of personnel, expectations versus service, and user interface. The analysis combined statistical input with cause and effect modeling (weighted answers) to produce indices of satisfaction, a standard method of the ACSI.


The CTA study used a standardized methodology, including a questionnaire with numeric answers for computerized entry, a direct collection and distribution system, and numbered, scaled ratings useful for statistical analysis. The survey also noted user demographics and the type of service used in order to find relevance in grouped answers. Although the results indicate a good level of customer satisfaction, a method for receiving specific feedback and recommendations would be helpful for further improvement.

Social Security Administration (SSA) Customer Satisfaction Survey (2001)

The SSA conducted this survey to evaluate the user-friendliness of SSA publications and the usefulness and objectivity of SSA analysis, and to gather opinions as to whether the SSA was focusing its research on the issues of greatest concern to policymakers and the public. This survey demonstrates the effective data analysis that can be conducted after a thorough stratification of respondents. It also demonstrates good tactics for increasing survey response rates, such as a pre-notification letter that introduces the survey and gives respondents time to prepare for participation. The upcoming survey will apply similar tactics to produce adequate response rates and to ensure thorough stratification of the sample.

Figure C.2 Sample Composition


Since the SSA deals with so many different kinds of people in different roles, the researchers found it prudent to separate their sample into groups based on the respondents' relationship to the SSA. Respondents were therefore classified into four groups: "Decisionmakers" were high-level officials; "Subscribers" were people who subscribed to SSA publications; "Non-Subscribers" were generally members of industry groups or related associations who were clearly interested in or did research on SSA matters; and "Stakeholders" were local individuals who had expressed some interest in the SSA.


The survey sampled 1,800 people in total: 59 Decisionmakers, 889 Subscribers, 512 Non-Subscribers, and 344 Stakeholders. The survey was distributed by mail, telephone, and internet, and gathered 1,043 replies (a response rate of 58%). Respondents were asked about the extent of their interaction with SSA publications. They were also asked to rate the accuracy, clarity, objectivity, comprehensiveness, and usefulness of SSA research on a five-point scale.
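As a check on these figures, the short sketch below recomputes the sample composition and the overall response rate from the group sizes reported above. Per-group response counts were not reported, so only the aggregate rate can be computed.

    # Recompute the SSA sample composition and overall response rate from
    # the group sizes reported above.

    sample = {
        "Decisionmakers": 59,
        "Subscribers": 889,
        "Non-Subscribers": 512,
        "Stakeholders": 344,
    }
    replies = 1043

    total = sum(sample.values())  # 1,804, which the report rounds to 1,800
    for group, n in sample.items():
        print(f"{group}: {n} ({100 * n / total:.0f}% of the sample)")
    print(f"Overall response rate: {100 * replies / total:.0f}%")  # about 58%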


The researchers included a demographic analysis of respondents, beyond the separation into the four respondent types. They analyzed how frequently respondents actually had contact with the SSA, as well as where they were employed and what backgrounds (such as education) they had. Many of the questions included an "Other-Specify" component, and the researchers made an effort to classify each open-ended answer under the predetermined selection choices, even if doing so meant marking more than one option.


Eighty-six percent of respondents were rated as having a positive opinion of SSA publications and research objectivity. Moreover, such positive ratings were not found to have been based on a small number of interactions. On the contrary, the survey found that half of all respondents had interacted with the SSA at least ten times in the past 24 months. Furthermore, the survey found that one quarter of all respondents were government employees, and a further one-fifth were in higher education. Finally, the survey reported the respondents believed Social Security reform and the future solvency of the Social Security system should be the SSA’s priorities for further research.


Figure C.3 SSA Survey Findings


APPENDIX D: GAO REPORTS


The U.S. Government Accountability Office (GAO) has produced several reports that include aspects of the Department of Homeland Security’s (DHS) interaction with State and local government emergency officials. These reports were catalogued and analyzed for takeaways to contribute to the survey design, particularly for guidance on issues that the Survey of State and Local Government Officials should address.


Overall, the GAO reports demonstrate the depth and breadth of homeland security issues in which State and local jurisdictions play a prominent role, from transportation and border security to protective services and critical infrastructure protection. DHS is the lead agency in many cases, but also works with a number of other institutions on different levels, including Federal, State, local, and private sector entities. Reports reflect a need for accountability and better management of first responder grant programs, increased regional coordination, and prioritization of security initiatives, including protecting targets and defining threat levels.


The GAO reports assert that current Federal strategies for specific programs need to be assessed and updated to reflect ongoing challenges to security goals. Presidential directives, the national security strategy, and national advisory systems establish broad objectives that may create programs and define agency roles, but fail to include guidance on implementation and best practices. The reports generally show that departments are making an effort to establish capabilities and carry out program goals, but are hampered by management issues such as effective grant distribution. Often, initiatives that should cut across agencies are duplicated or suffer from a lack of communication. GAO concludes almost every report with a call for oversight, coordination, the establishment of performance measures, and results-based planning for homeland security programs.

Nuclear Power: Plants Have Upgraded Security, but the Nuclear Regulatory Commission Needs to Improve Its Process for Revising the Design Basis Threat (GAO-06-555T)

GAO reported on security at nuclear power plants in conjunction with the efforts of the Nuclear Regulatory Commission (NRC), an independent agency that regulates and oversees security at the plants. The nation's commercial nuclear power plants are recognized as potential post-9/11 targets for terrorists seeking to cause the release of radioactive material. GAO reviewed NRC's April 2003 effort to revise the design basis threat (DBT), a description of the threat that plants must be prepared to defend against. The report also addressed actions nuclear power plants have taken to enhance security in response to the revised DBT, and NRC's efforts to strengthen the conduct of its force-on-force inspections (mock terrorist attacks). In comparison to the September 2003 assessment, inspections are now conducted more frequently and use more updated and realistic attack techniques. Lingering issues included a need for increased security training, expertise, and consistent security backgrounds among the controllers. Measures also need to be taken to prevent insider attacks and to vary inspections. On a broader level, the fundamental question of whether the DBT accurately represents the terrorist threat remains open.

Border Security: Continued Weaknesses in Screening Entrants into the United States (GAO-06-976T)

GAO conducted a follow-on to its 2003 and 2004 studies of border security, specifically of U.S. Customs and Border Protection's (CBP) ability to detect counterfeit identification presented by those entering the United States. As late as 2004, GAO agents found they were able to enter the United States from Canada and Mexico easily, using fictitious names and counterfeit driver's licenses and birth certificates. The follow-up investigation sent agents through nine land crossings on the northern and southern borders, using fictitious documents created with commercial software available to the public. The agents gained entry at all nine locations, across seven different states. These tests revealed that the vulnerability persists and could allow terrorists or other criminals to cross from Canada or Mexico into the United States. In a corrective action briefing, CBP officials acknowledged the inability to identify these false IDs and expressed support for a new initiative to require all travelers to present passports or other secure identification to enter or reenter the United States.

Homeland Security: Overview of Department of Homeland Security Management Challenges (GAO-05-573T)

The DHS creation process was designated a high-risk transition by the GAO. In particular, the GAO indicated that DHS would face difficulties establishing partnerships with stakeholders, including State and local government emergency officials, who were more accustomed to working with DHS component agencies than with DHS itself. The GAO also recommended that, in developing its operations plans, DHS solicit employee input, placing special emphasis on employees serving in the field who interact with State, local, and tribal officials on a regular basis. The report also noted the challenges facing DHS in defining intergovernmental homeland security roles and in information sharing among all stakeholders.

Transportation Security: Systematic Planning Needed to Optimize Resources (GAO-05-357T)

GAO assessed the efforts of the Transportation Security Administration (TSA) and DHS to reduce transportation risks while preserving efficiency. The GAO found that DHS and TSA have undertaken several programs toward this goal, including partnering with Federal agencies, State governments, and the general aviation industry to secure general aviation operations.


The GAO also found that TSA has not implemented a risk-management approach, which previous GAO research has found can help inform decision makers in allocating finite resources to the areas of greatest need. GAO further noted that DHS should no longer rely on restructuring as a solution to the many challenges it faces. Instead, DHS must confront several program issues, including creating guidelines for identifying eligibility requirements for the Transportation Worker Identification Credential, setting quantifiable goals for research and development in strategic plans, and using risk assessments to prioritize these efforts.

Homeland Security: Management of First Responder Grant Programs Has Improved, but Challenges Remain (GAO-05-121)

GAO investigated the ability of the Office for Domestic Preparedness (ODP) to effectively manage its growing domestic preparedness grant programs. In this investigation, the GAO examined the process by which grants for states and urban areas were administered in 2002 and 2003, and how ODP ensured that the funds were spent in accordance with grant guidance on State preparedness planning. GAO also examined how the timeframes established by Congress and ODP influenced the grant cycle.


GAO found that ODP has established grant award procedures for states and localities to improve accountability in State preparedness planning, and that Congress and ODP have made efforts to expedite grant awards by setting time limits for grant application, award, and distribution processes. However, the ability of States and localities to spend grant funds expeditiously is hampered by various legal and procurement requirements. ODP is attempting to resolve the situation by identifying best practices to help states and localities effectively address the issue.

Homeland Security: Managing First Responder Grants to Enhance Emergency Preparedness in the National Capital Region (GAO-05-889T)

GAO reported that the NCR lacked vital components needed to enable first responders from various jurisdictions to collaborate effectively in the prevention of, preparation for, response to, and recovery from a terrorist attack. In the same report, the GAO made three recommendations, all of which had been partially implemented at the time of this report.


GAO’s first recommendation was that DHS work with NCR jurisdictions to create a coordinated strategic plan to enhance emergency responder capabilities so that Federal funds may be more effectively directed. DHS reported that a draft of a plan featuring tangible objectives had been completed. The second GAO recommendation was for DHS to monitor the plan’s execution in order to protect against an indiscriminate use of funds. Finally, the GAO recommended that DHS address any remaining inadequacies in emergency preparedness and assess how well Federal funds address those failures.


The GAO concluded that implementation of its recommendations would be invaluable in developing the structure, processes, and data needed to assess first responder capabilities in the NCR and in evaluating the success of efforts to improve those capabilities.


Homeland Security: Actions Needed to Better Protect National Icons and Federal Office Buildings from Terrorism (GAO-05-790)

The Department of the Interior (DOI) and the General Services Administration (GSA) are responsible for aspects of the protection of national icons, monuments, and buildings from terrorism. These agencies must ensure public access to the sites while also providing physical security. Though DHS is directly responsible for law enforcement and related security functions at GSA facilities and provides policy leadership on facility protection issues to DOI, policy papers have identified DOI as the lead Federal entity, in conjunction with DHS, for protecting icons, monuments, and other key assets. Such conflicting responsibilities may create a contentious environment. Consequently, the GAO noted that setting guiding principles through which DOI may balance its core mission with security could be extremely beneficial.


Other organizations, including the GSA, have utilized this tactic to promote transparency in such complex situations; the GSA has created a memorandum of agreement with the DHS regarding security at GSA facilities. However, the GSA still lacks a mechanism with which to coordinate homeland security efforts at its buildings with the Federal Protective Service (FPS) and other agencies. The GAO recommended that the GSA establish a chief security officer position or formal point of contact for this task.

Homeland Security: Management of First Responder Grant Programs and Efforts to Improve Accountability Continue to Evolve (GAO-05-530T)

GAO provided testimony from William Jenkins, Director of GAO Homeland Security and Justice Issues, on the history and evolution of the first responder grant programs, the State Homeland Security Grant Program (SHSGP) and the UASI, which together channeled approximately $10.5 billion in DHS funding from FY02 through FY05. Since FY04, ODP has coordinated grant award procedures through a single application covering six grant programs for State and urban area applicants. State Administrative Agencies (SAAs) and Urban Area Working Groups (UAWGs) have flexibility in distributing these grants to local jurisdictions and urban areas. Congress and State and local officials have expressed concerns about the time required to award and transfer funds.


The report found that, due to revised procedures, ODP has made significant progress since FY03 in expediting grants to states. However, challenges remain in procurement at the State and local level. To ensure accountability for effective use of grant funds, needs assessments had been conducted before September 11, 2001 to assist states in developing security strategies. Through fiscal years 2003 and 2004, strategies were updated to reflect post-9/11 realities and to demonstrate progress on original plans. ODP has also revised how states and jurisdictions report on grant spending and use, shifting from equipment purchase catalogues to results-based reporting on how expenditures meet preparedness needs. The report reiterated the need to distribute and use Federal first responder grants efficiently, in coordination with a funding plan based on comprehensive preparedness planning.

Homeland Security: Key Cargo Security Programs Can Be Improved (GAO-05-466T)

GAO conducted a review of potential improvements to the Customs-Trade Partnership Against Terrorism (C-TPAT) and Container Security Initiative (CSI) programs. These are part of a “layered enforcement strategy” by CBP to address the threat of weapons of mass destruction (WMD) being smuggled into the United States. C-TPAT attempts to secure the international supply chain (the flow of goods from manufacturer to retailer): members of the international trade community agree to improve the security of their supply chains in exchange for a reduced likelihood of inspection by CBP. Because enrollment in the C-TPAT program first requires private companies to assess their own risk, CBP cannot verify that members have reliable, accurate, and effective security profiles. There is also a lack of written program guidance for determining adequacy. Because of staffing constraints, only about 11% of certified C-TPAT members have been validated by CBP in the past three years.


The CSI program addresses the security of oceangoing cargo containers by placing CBP staff, in cooperation with foreign counterparts, at seaports abroad to assess risk and to target and inspect containers that could be carrying WMD. However, the ability to identify and target high-risk containers is limited. Staffing imbalances abroad mean that CBP officials cannot screen all containers shipped from CSI ports before they depart for the U.S. GAO recommends developing a management strategy to close these security loopholes: specifying program performance goals and evaluation measures, identifying external factors that affect program goals, and revising staffing models and other capability requirements.

Homeland Security: Agency Plans, Implementation, and Challenges Regarding the National Strategy for Homeland Security (GAO-05-33)

GAO reviewed the implementation of the National Strategy for Homeland Security to determine whether the planning and implementation activities of lead agencies address security initiatives, whether the structure of agencies contributes to implementation, and what homeland security challenges have emerged since September 11, 2001. The National Strategy for Homeland Security, a plan to improve homeland security through the cooperation of Federal, State, local, and private sector organizations, organizes a critical array of functions into six distinct “critical mission areas”: intelligence and warning, border and transportation security, domestic counterterrorism, protecting critical infrastructures and key assets, defending against catastrophic threats, and emergency preparedness and response.


The strategy identifies 43 “major initiatives” to be addressed within and across the mission areas. Although a lead agency was identified for implementation in almost all cases, many initiatives had multiple lead agencies, and more than three-quarters of the initiatives were implemented by three of the six departments reviewed. GAO notes that an improved risk management framework is required to guide further investment, as is an improved set of performance measures to gauge progress and results. In many cases, the overarching National Strategy and Homeland Security Presidential Directives (HSPDs) do not define the roles of the State, local, and private sectors. A major challenge for the National Strategy for Homeland Security will be coordinating these entities, with congressional oversight to ensure implementation.

Homeland Security: Effective Regional Coordination Can Enhance Emergency Preparedness (GAO-04-1009)

GAO reviewed factors that may lead to effective regional homeland security coordination and examined how such coordination could be further promoted. In addition to the several local factors that foster and encourage coordination, the Federal government in general, and DHS in particular, can encourage regional coordination by requiring it as part of the grant process. This is especially true if grant programs allow regions the flexibility to organize themselves in accordance with their needs. The report cited Federal transportation and UASI grants as examples of programs that emphasize regional coordination in their grant management policies. In particular, the report cited the National Capital Region as needing specific planning and grant incentives to promote regional coordination.

Homeland Security: Transformation Strategy Needed to Address Challenges Facing the Federal Protective Service (GAO-04-537)

GAO evaluated the efforts of the FPS to adapt to the organization’s new responsibilities outlined in the Homeland Security Act of 2002. The FPS plays a critical role in the Federal government’s defense against the threat of terrorism and other criminal activity. Since being transferred from the GSA to DHS in March 2003, FPS has faced a new set of challenges, including how to adapt to its expanding responsibilities. One such added responsibility is in the law enforcement sector. The 2002 Homeland Security Act gave FPS additional law enforcement authority, which allows officers to enter into agreements with State and local law enforcement personnel to carry out activities that promote homeland security. It also empowers officers and special agents to take action to protect the public, even if an incident occurs off Federal property. FPS officers noted that this added authority will make them more effective in promoting security and protecting facilities, and will allow them to be more involved in intergovernmental activities such as biological and chemical weapons response training.

Homeland Security: Risk Communication Principles May Assist in Refinement of the Homeland Security Advisory System (HSAS) (GAO-04-538T)

GAO summarized the operations of HSAS, established in March 2002 to disseminate information regarding the risk of terrorist acts to Federal, State, and local government agencies, private industry, and the public. GAO also examined the risk communication literature to consider when, what, and how information should be disseminated about threat level changes. This review showed that appropriate use of threat bulletins can inform the population and increase preparedness; used poorly, warnings can prompt terrorists to change tactics and can raise general anxiety.

The information currently provided to Federal, State, and local agencies, private industry, and the public regarding terrorist threats includes the threat level and general information on why the level was raised; it does not generally include the locations of potential threats or timeframes. The new advisory system led recipient entities to express the need to be informed of appropriate responses to heightened alerts. In addition to the general suggested protective measures for Code Orange alerts issued by the American Red Cross, DHS, and others, Federal, State, and local officials have expressed a need for specific threat information in order to prepare appropriate responses. Federal agencies would also find recommended incident prevention measures helpful.

Homeland Security: Effective Intergovernmental Coordination Is Key to Success (GAO-02-1011T)

Given the challenges homeland security poses for any one level of government or agency (the Departments of Justice, Transportation, and Health and Human Services, among others), an integrated approach is essential to protecting against threats and coordinating resources and authority. This report is testimony by GAO Strategic Issues director Patricia Dalton, given in response to the proposal of a consolidated DHS. Dalton draws on national security strategy documents and interviews with State emergency management officials and officials in five cities (Baltimore, Denver, Los Angeles, New Orleans, and Seattle) to focus on the elements needed for effective implementation: establishing a leadership structure, defining the roles of different levels of government, developing performance goals and measures, and using the appropriate tools to best achieve and sustain national goals. A statutorily based DHS, the testimony argues, will “capture homeland security as a long-term commitment grounded in the institutional framework of the nation’s governmental structure” and “ensure legitimacy, authority, sustainability, and the appropriate accountability to the Congress and the American people.”


APPENDIX E: DHS EXISTING DATA

The following examples of existing data analysis within DHS include evaluations and commentary in the form of strategy assessments, after-action reports (AARs), and other feedback from State and local government officials. Agencies involved in homeland security and grant guidance, such as TSA and the Office of Management and Budget (OMB), and the various programs within DHS have their own means of monitoring Federal, State, and local interaction in activities such as grant distribution and technical assistance. The information included here illustrates existing evaluation methodology that can inform the survey and the Final Report recommendations.

Training Division Evaluator Database

DHS offers training classes and programs to educate and train first responders and other State and local officials in planning for, preventing, responding to, and recovering from Incidents of National Significance. As part of the standard curriculum, participants are asked to complete a course evaluation survey upon finishing a preparedness training course. The survey is 22 questions long and prompts participants to rate the class, its instructor, the benefit they received, the overall quality of the course, and their knowledge, skills, and abilities both before and after course completion. Responses are collected via an online collection tool and stored in a training evaluation database. Researchers analyzed the database to assess the effectiveness and user-friendliness of the training programs.

Figure E.1 Training Courses by Preparedness Area


The database contains information for preparedness courses administered in 40 States from October 2005 to August 2006. In total, 36 separate course curricula were offered (although some courses were offered multiple times and in different locations). Courses addressed three general preparedness areas: performance, management, and awareness. The database contains responses from over 6,000 participants who submitted course surveys for the 242 preparedness classes administered during the survey period.


Course offerings were geographically concentrated in the areas of the country that are most frequently confronted with emergency situations. For example, 45% of the training courses in the Evaluator Database were offered in FEMA Region 4, which is frequently battered by natural disasters.

Figure E.2 Percent of total number of courses offered, displayed by FEMA region






Database information was analyzed to study the effectiveness of the training programs. Participants generally reported an increase in their Knowledge, Skills, and Abilities (KSA) level after completing a course. Participants were given the following options to describe their KSA levels: None, Little, Basic, Intermediate, Advanced, and Not Applicable. The most frequent KSA self-rating before participation in the training courses was “Basic”, while “Intermediate” was the most frequent self-assessed level after course completion. This result holds for all three preparedness training areas, and very few participants rated themselves at a KSA level of “Basic” or below upon completing a course.


The database also provides a self-analysis capability that flags courses with low ratings. To be flagged, a course or its instructor must receive poor ratings from more than 33% of the class on at least 5 of the 22 questions included in the survey (courses may also be identified if more than 33% of the class gives themselves a KSA rating of “Intermediate” or “Advanced”). Of the 262 classes offered during the survey period, only one was flagged, indicating that participants were generally satisfied and found the classes useful. With very few exceptions, participants agreed or agreed strongly that their course was relevant, practical, and appropriate for their level of experience.
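As a concrete reading of that flagging rule, the sketch below checks a class's responses against the threshold. The data layout and the definition of a "poor" rating (here, 1 or 2 on a five-point scale) are assumptions, since the database's internal conventions are not documented in this report.

    # A minimal sketch of the flagging rule described above: a course is
    # flagged when more than 33% of the class gives a poor rating on at
    # least 5 of the 22 survey questions. The data layout and "poor"
    # cutoff (a rating of 1 or 2) are assumptions.

    POOR_SHARE = 1 / 3   # more than 33% of the class
    MIN_QUESTIONS = 5    # on at least 5 of the 22 questions
    NUM_QUESTIONS = 22

    def is_flagged(responses):
        """responses: one dict per participant, mapping question number -> rating."""
        n = len(responses)
        low_rated_questions = 0
        for q in range(1, NUM_QUESTIONS + 1):
            poor = sum(1 for r in responses if r.get(q, 5) <= 2)
            if poor / n > POOR_SHARE:
                low_rated_questions += 1
        return low_rated_questions >= MIN_QUESTIONS

    # Example: a three-person class that rates question 1 poorly across the
    # board still falls short of the five-question threshold, so no flag.
    example_class = [{q: (1 if q == 1 else 4) for q in range(1, 23)} for _ in range(3)]
    print(is_flagged(example_class))  # False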












Figure E.3 Training Evaluator Database

Graphic: Graph displaying the change in KSA level for management courses included in the database; similar relationships exist for the performance and awareness courses included in the database.



The data generally support the conclusions that training courses effectively communicate information, increase the KSAs of the participants, and are accessible and easy to use for the intended audience.


Homeland Security Grant Program After-Action Conference

The HSGP serves as DHS’s primary means of providing homeland security assistance to State and local communities.  The HSGP After-Action Conference was held in San Diego, California, on July 11 and 12, 2006, to provide an open forum for gathering feedback from participating State and local partners regarding the FY06 HSGP process, as well as ideas for future improvements.  The conference, which brought together approximately 130 State and local representatives from 46 States and territories, was divided into four working groups that addressed portions of the FY06 HSGP process: homeland security planning, HSGP guidance and application, effectiveness analysis, and risk analysis.


Homeland Security Planning

DHS has emphasized homeland security planning by utilizing a common planning framework, including the State and Urban Area Homeland Security Strategy and the Program and Capability Review, and developing the Enhancement Plan and the HSGP application’s Investment Justification.


State and Urban Area Homeland Security Strategies create a shared framework through which preparedness programs may be assessed within and across State boundaries.  They also provide a foundation for homeland security planning focused on Federal, State, and local priorities.  In 2005, states and urban areas were required to update their strategies to align with the Interim National Preparedness Goal and the national priorities. This represents the first integral step in better connecting the FY06 HSGP cycle to the Interim National Preparedness Goal. 


The homeland security program is analyzed through the Program and Capability Review (PCR), a process for discussing and evaluating the homeland security program and its component activities.  The PCR emphasizes an enterprise-wide, multi-disciplinary, and multi-jurisdictional approach to State preparedness planning.  States were encouraged to leverage existing resources, including the State Homeland Security Strategy, current State and local plans, and grant data, as a starting point for conducting the PCR.  At the conclusion of the PCR, states created the Enhancement Plan, a management document designed to enumerate the initiatives necessary to sustain preparedness, mitigate security weaknesses, and outline priorities for future spending.


The following ten recommendations were made for homeland security in FY 2007 and beyond:


  1. Though State and local partners concurred that the FY06 planning process was the most successful to date, their first recommendation was to improve the HSGP process while maintaining its basic structure.

  2. Building upon the previous recommendation, State and local partners suggested that specific guidelines be set for the integration of the added requirements with each other, and with the preexisting requirements.

  3. State and local leaders disclosed their desire to receive notification from DHS of new requirements well in advance.  Participants recommended that DHS create a fixed calendar and provide advance notice regarding upcoming changes and requirements.  Such an organized timeline would provide State and local leaders with adequate time to organize their capabilities and to respond to innovative application requirements, such as the Enhancement Plan and Investment Justification.

  4. Participants recommended that DHS not increase the current size of the Target Capabilities List (TCL).  State and local partners agreed that maintaining a limited TCL allowed for standardization of assessment across the country. Conversely, increasing the TCL would unnecessarily spread resources across the capabilities and decrease the effectiveness of TCL reviews.

  5. Participants recommended that DHS develop a fixed planning cycle for State and local jurisdictions.  This cycle would consist of conducting a risk assessment and a capability assessment, updating the Homeland Security Strategy and the Enhancement Plan, translating all planning tools into the Investment Justification, and reintroducing grant awards into the Enhancement Plan and Investment Justification.

  6. State and local partners concluded that, although they can conduct their own risk assessments, they need additional assistance from DHS in identifying how to manage and mitigate that risk.  Participants recommended that DHS provide a standardized risk assessment methodology so that the individual states and jurisdictions may successfully manage their own risks and provide DHS with conclusive information, rather than having DHS dictate their risks.

  7. Participants recommended that DHS provide states and local jurisdictions with technical support programs, such as assistance with strategic planning and with completing the Investment Justification.  State and local partners also noted that training programs created by DHS would be useful in familiarizing new hires with DHS programs, processes, and procedures.

  8. State and local partners noted the substantial overlap of the State and Urban Area Homeland Security Strategies, the Enhancement Plan, and the Investment Justification.  They subsequently recommended that the commonalities be identified and the models be updated to eliminate overlap and create more effective linkages among the three documents.  In this way, it was concluded, the documents may be more effectively utilized for determining funding allocation.

  9. Participants recommended that State and local planners be allowed to present their own State and Urban Area Homeland Security Strategy to the DHS Strategy Review Board in order to provide additional clarification for review board members. 

  10. Participants wanted DHS to encourage States to engage in regionalization.  Specifically, State and local partners concluded that increased incentives would likely increase the probability that states would voluntarily engage in interstate activities, thus leading to regionalization.


HSGP Guidance and Application

The FY06 Investment Justification explained how the initiatives specified in the Enhancement Plan were to be carried out.  The four high-level sections of the Investment Justification (Background, Regionalization, Impact, and Funding and Implementation Plan) addressed how well each State tackled the specified initiatives.  Each State and urban area was allowed to propose 15 investments, based on the proposals set forth in the Enhancement Plan.  All states and urban areas were encouraged to collaborate in this effort to guard against duplicated efforts and to ensure a cohesive approach.  Many states and urban areas were able to combine several projects into one investment, and numerous investments into one overarching investment. 


The following eight recommendations were made in the area of HSGP guidance and application for FY 2007 and beyond:


  1. DHS should change the guidance structure, moving some information to the appendices.  State and local partners specifically noted the reference materials in the HSGP Guidance and Application Kit; though this information may be useful, they agreed that it is extraneous for the applicants who will be reading it and could cause unnecessary confusion.

  2. DHS should provide grant guidance in both Adobe Acrobat® and Microsoft Word® formats so that participants can easily cut and paste sections of the documents.

  3. Participants noted that they had prepared their applications based on the most current guidelines made available to them by DHS.  However, DHS released additional guidelines just one month before the application deadline.  State and local partners recommended that application scoring criteria and guidelines be made available in the program guidance and application kit so that participants may better prepare their applications ahead of time.

  4. State and local partners remarked that the tool used to create the Investment Justification did not allow them to conduct a spell check or to add graphics to the document.  Such functionality insufficiencies made completion of the document an unnecessarily time-consuming task. It was therefore recommended that DHS ensure sufficient functionality.

  5. Introduce a page limit to be used instead of a character limit for each Investment Justification.  In this way, individual jurisdictions may follow set guidelines while simultaneously retaining the freedom to decide how much detail to include in each section.

  6. State and local partners concluded that including 17 questions in the investment justification template was excessive.  They suggested instead that the questions be organized into five areas: Background/Scope/Scalability of Investment, Impact, Funding Plan, Long-Term Plan/Institutionalization, and Regionalization (including tribal and international partners).  Participants also requested that DHS provide definitions for the terms used throughout the Investment Justification in order to promote consistency.

  7. Participants suggested that a section be added into the Investment Justification to illustrate its direct linkage to the Enhancement Plan.

  8. Finally, participants suggested that DHS facilitate increased communication with preparedness officers during the grant application process.  Such openness in communication would allow for ease of clarification and assistance on the application.


Effectiveness Analysis

The proposed improvements were scored by a set of peer reviewers who took into account the effectiveness of the individual investments, as well as the overall effectiveness of each State and urban area’s program.  The final scores were then combined with DHS risk analysis scores to determine final HSGP allocations.  This process created incentives for states and urban areas to leverage HSGP funds effectively in order to achieve previously established security program objectives.


State and local partners concluded that the review process was effective. The structure of the panel created a workable environment in which to evaluate the applications, and the experience and expertise of the peer reviewers allowed for valuable analyses. 


The following 11 recommendations were made for effectiveness analysis in FY07 and beyond:


  1. Maintain the simplicity of the generally successful process.  In order to ensure that the focus of the Investment Justifications is on content, not format, the program will need to be streamlined, though the overall structure should remain constant. 

  2. Improve the questions and the scoring process.  State and local partners noted confusion among evaluators regarding the value of the numerical scores. It was therefore recommended that reviewers receive detailed instructions and time to ask questions regarding scoring, and that a qualitative score be given in addition to the numerical score in order to avoid confusion.

  3. Reformat the Investment Justification to allow for the addition of a budget narrative, as well as increased flexibility generally.  A budget narrative should be included alongside the Investment Justification, and participants should be able to import and export data in the Investment Justification template.  Finally, the Excel® format proved to be inflexible; a budget in Word® format would be more accommodating.

  4. State and local partners remarked that the given timelines were impractical, as Investment Justification deadlines oftentimes coincided with numerous other DHS deadlines, thus leading to substandard work.  It was therefore suggested that the timelines be modified to allow participants to complete high-quality work within a realistic timeframe.

  5. Eliminate the overall Investment Justification scores.  Participants concluded that this general score was not representative of the true quality of the application.  They additionally suggested that reviewers would be able to provide better feedback if the components of the Investment Justification were integrated as questions in the individual investments. 

  6. DHS should provide additional guidance to peer reviewers in order to assist them in making more useful comments.  Increased transparency was also proposed, as peer reviewer comments were oftentimes difficult to access.

  7. Clearly communicate the relative importance of the effectiveness analysis in the beginning of the process, and weigh the effectiveness analysis more heavily in the funding allocation process.

  8. Include an AAR to highlight lessons learned about the peer review process from a peer reviewer perspective.  Such a report, which would provide the basis of a technical assistance program, would lead to consistent, high-quality investments in the future. 

  9. Participants identified several positive aspects of the Investment Justification review process, including the composition of the panels, the number of Investment Justifications assessed by each reviewer, the number of investments in each Investment Justification, and the variety of subject matter experts.  The recommendation was to maintain this successful balance.

  10. The vast differences between the specific needs of urban areas and those of the encompassing states were apparent to the participants. It was therefore recommended that a separate Enhancement Plan be drawn up for each urban area, and that it be included as an appendix to the State’s Enhancement Plan. 

  11. Allow State and urban area representatives to be present during the process so that they may clarify components of their application and receive first-hand advice on what can be improved for the future.


Risk Analysis

The FY06 risk methodology, which provided the most accurate assessment to date of the relative threat of terrorism for various jurisdictions, included several improvements from previous years.  These enhancements included integration of strategic threat analyses from the intelligence community, increased scope in critical infrastructure and key asset data, encouragement of regionalization through inclusion of populated areas outside official city limits, and inclusion of transient populations, such as commuters and tourists.


State and local partners promoted a risk-based approach to national preparedness, and provided the following three recommendations to DHS for FY07 and beyond:


  1. Present detailed in-person briefings to State and local partners regarding the foundational elements of the risk methodology used in the FY06 process.  These briefings will clarify the process for State and local leaders, provide them with an opportunity to ask questions about the risk methodology, and allow them to translate the acquired knowledge to other representatives within their jurisdictions.

  2. DHS should establish a working group of Federal, State, and local representatives to assess the geographic characteristics and asset types used in the risk methodology.  Important issues to address include population density and international borders, as well as how to successfully factor in risk reduction, natural hazards risk, the Nationwide Plan Review (NPR), and the National Infrastructure Protection Plan.

  3. DHS should utilize the input of State and local representatives during the validation of information used in the risk analysis process.  Access to the specific list of assets used in the FY06 risk analysis would provide State and local representatives with a better understanding of what infrastructure is affecting their allocations.  


Next Steps

DHS will continue to strive for improvements in the HSGP process.  State and local partners remarked that the FY06 process provided a strong foundation on which to build for future fiscal years.  The process created an open forum for frank discussion of new ideas, and was invaluable in opening lines of communication among State and urban area representatives and DHS.  State and local partners appreciated the efforts of DHS to ensure that security planning continues to be a collaborative process, as national security is a shared responsibility.


State and Urban Area Feedback from the Nationwide Plan Review (NPR)

In the wake of the 2005 hurricane season, the President and Congress launched the NPR, a comprehensive assessment of the nation’s emergency operations plans to determine their ability to address a major catastrophic event. DHS released the NPR Phase 1 Report in March 2006 and the NPR Phase 2 Report in June 2006. The reports examined the plans of 56 States and territories and the 75 largest U.S. urban areas. In Phase 1, States and urban areas submitted self-assessments of their emergency operations plans (EOPs), focusing on their ability to manage a large-scale emergency. Each jurisdiction was required to complete an open-ended questionnaire as well as a qualitative question matrix and return it to DHS. In Phase 2, DHS employed Peer Review Teams consisting of 77 former State and local homeland security and emergency management officials to visit the States and urban areas, review and validate the self-assessments, and help determine requirements for Federal planning assistance. At the conclusion of each visit, the Peer Review Team completed a comprehensive report and submitted it to DHS.

State and local feedback from the NPR generally concerned aspects of the review process the local jurisdictions felt ought to have been done differently. For instance, several participants criticized the public release of the findings, since they had been led to believe the evaluation would not be disclosed. This was especially pertinent since participants worried that the findings could be misconstrued by the public or the media. Other participants recommended that future evaluations be given more time to be completed, ensuring that no jurisdiction’s evaluation was rushed due to lack of time.

The experience with the NPR provided several lessons for the researchers designing the survey contained in this report. First, it is important to ensure that respondents are aware that, although they will never be named, their invaluable responses will contribute to a report that will be released to the public. Second, it is essential that respondents be given enough time to thoughtfully and thoroughly complete the survey. These lessons formed an important part of the survey design process, helping ensure that State and local officials are as satisfied as possible with the survey process.


Mobile Implementation Training Teams (MITT)

As part of the National Preparedness Goal (NPG) mandated by Homeland Security Presidential Directive-8 (HSPD-8), DHS created the MITT program to provide a conduit for DHS to give guidance and receive feedback from the States and local jurisdictions charged with implementing the NPG.


In April 2005, DHS began widely disseminating information relevant to the NPG and its implementation requirements. DHS management personnel orchestrated several “roll-out” conferences around the country that included in-depth presentations on pertinent HSPD and NPG related information. These conferences laid the groundwork for the MITT program to provide a more personalized forum for State-specific groups of homeland security officials to express key issues, concerns, and recommendations for implementing the NPG.


The goal of the MITT program was for DHS representatives to meet with State advisors and homeland security officials from each of the 50 States, five U.S. territories, and the District of Columbia to conduct facilitated discussions about NPG implementation. Project coordinators formed six MITTs to facilitate the individual State briefings. Each two-person team was assigned nine States; meetings began on July 19, 2005, and within 112 calendar days the teams completed 55 State visits. Minutes of the meetings were documented and cataloged so that they could be aggregated and analyzed as a group.


DHS standardized the meeting minutes as much as possible. The same diversity and uniqueness of State issues that necessitated the individual State discussions, however, also prevented the level of standardization needed for traditional quantitative analysis. MITTs produced narrative summaries of the comments and concerns expressed in the meetings. Issues and possible recommendations for the future raised at the State level were reported to DHS, along with cross-cutting themes that consistently emerged in these discussions.


While the number of issues per meeting and attendance varied throughout the course of the project, participants were generally supportive of the overall NPG effort, including the seven national priorities for homeland security and the adoption of a capabilities-based planning strategy. The total number of issues and associated recommendations generated in the State meetings was 542. Analysts identified 63 cross-cutting themes and key strategic issues from the aggregated list that detailed the concerns of all States, territories, and the District of Columbia.


Two overarching categories emerged as the principal themes for improvement: 1) DHS presence in the field is not sufficient to support program continuity and sustainability; and 2) Federal agencies are neither sufficiently coordinated nor universally committed to achieving the NPG. The State meetings were generally viewed by participants as a significant step forward in the effort to implement the NPG. State government and homeland security officials generally supported the coordination efforts and expressed the desire for increased participation on all levels.


Although it may not be possible to adopt the exact MITT program methodology for the proposed Survey of State and Local Government Emergency Officials, the program highlights the existence of working homeland security focal points in each of the 55 jurisdictions covered in the study. These focal points may prove useful in distributing the proposed survey, ensuring that the respondent pool is distributed broadly enough to provide an adequate representation. Records on meeting attendance may also be a useful predictor of survey participation at the State level and could be used for planning purposes when designing the survey sample. Additionally, although overarching themes were identified to facilitate policy development, the individual State issues and reports were incorporated in shaping the content questions for the proposed survey. The MITT program’s interaction strategy and results can be built upon to strengthen the upcoming DHS survey of State and local officials.


Recommendations from Homeland Security Grant Program (HSGP) National Asset Data Base (NADB)

The NADB is an inventory of the nation’s critical infrastructure and key resources compiled from Federal, State, local, and private sector input. The management of the NADB, carried out by the Office of Grants and Training (G&T) and the Infrastructure Protection Risk Management Division (RMD), provides insight into the importance of accurate data collection and provision of proper guidance materials to officials.


In June 2006, the Department's Office of Inspections and Special Reviews, Inspector General (OIG), identified the need to utilize a web-based mechanism for accurate data collection. The HSGP subsequently began using a web-based collection tool known as iMapData to gather relevant asset data from States, territories, and urban areas. The iMapData system provides a DHS-verified list to which users may upload asset data by selecting a desired area of interest. This form of data collection aims to use predefined assets in order to limit duplicative and dubious entries into the system, thereby allowing the NADB to provide a comprehensive and accurate representation of the nation's infrastructure.


The accuracy of the data collected by State, territorial, and urban area officials is further ensured by materials that guide officials through the reporting process. While a 2003 data call provided minimal guidance, DHS issued "Guidelines for Identifying National Level Critical Infrastructure and Key Resources" in 2004. These guidelines provide infrastructure categories, as well as parameters to disaggregate the level of criticality and to help State, territorial, and urban area officials identify assets. In 2005, the Assistant Secretary for Infrastructure Protection approved an NADB taxonomy that created a standard national infrastructure terminology.


As DHS has placed increasing emphasis on risk analysis, the development and use of guidance materials for State, territorial, and urban area officials has evolved into a collaborative effort. Officials are now actively involved in this process through working groups and increased interaction with DHS personnel. These efforts will leverage expertise at all levels of government to create the most accurate representation of the nation's infrastructure to date.


Rural Crime and Justice Center: Nationwide Rural Area Law Enforcement Study – Comprehensive Training Evaluation Report

The Federal Law Enforcement Training Center's (FLETC) Office of State and Local Training (OSL) was established in 1982 to provide advanced and specialized training programs that reach out to small towns and rural areas, providing local law enforcement agencies with personalized and Internet-based training at little or no tuition cost. The broad spectrum of programs includes training on topics such as Anti-Terrorism Intelligence Awareness, Domestic Terrorism and Hate Crimes, Drug Enforcement, and First Responder, as well as many train-the-trainer programs.


In September 2004, the Rural Crime and Justice Center (RCJC) evaluated programs of the National Center for State and Local Law Enforcement Training as part of the Nationwide Rural Area Law Enforcement Study (NRALES). All students who had attended at least half of the scheduled courses were included in the survey. Six training programs were represented in the evaluation, which used pre- and post-tests as well as three- and six-month follow-up evaluations.


Overall recommendations following the study included:


  • Improving communication between the Rural Crime and Justice Center and the Federal Law Enforcement Training Center, namely through a three-day conference between program specialists, RCJC staff, and contracted instructors

  • Establishing a baseline of knowledge and performance measurement through the pre- and post-test model, with a more standardized method of administration

  • Providing PowerPoint® and other materials designed specifically as instructional tools


Training Program Topics and Background Information


Domestic Violence Training Program

The Domestic Violence Training Program (DVTP) is a five-day, train-the-trainer program that prepares trainers to deliver domestic violence training to officers in rural jurisdictions. Participants learn new ways to present various domestic violence-related topics and are provided a detailed program guide and instructional aids to set up in-service training. They serve as adjunct FLETC instructors for their respective jurisdictions and for the region. Survey-based recommendations for the DVTP included the need for more adult facilitation training, as demonstrated by the instructor's methods in the train-the-trainer session.


First Responder Training Program

The First Responder Training Program (FRTP) is a three-day program that covers the first response of law enforcement personnel to major incidents. After-action reports (AARs) from previous incidents, ranging from criminal acts to natural disasters, provide lessons learned that the program uses to identify specific areas of concern for law enforcement agencies. The FRTP provides guidance for future response to major incidents, including WMD, Incident Command Systems, and Critical Incident Response. Participants in this program recommended more hands-on instruction to maintain interest, a longer class time to cover all topics thoroughly, and more prepared and targeted instructor presentations.


Drug Law Enforcement School for Patrol Officers

The Drug Law Enforcement School for Patrol Officers (DLESP) provides training to police officers and sheriff's deputies in a two-day program focused on detecting drug-related crime in their streets and communities. Training topics include the development of reasonable suspicion and probable cause, measures for responding to observations, and the development of a drug case toward a successful resolution. Participants must be full-time, sworn law enforcement officers with little or no drug enforcement training or experience. Survey respondents reported that the program objectives needed clarification and reinforcement for registrants, that more active teaching techniques were needed, and that a longer course would allow time for administrative and other items.


Drug Task Force Supervisor School

To address the operation of multi-jurisdictional rural drug enforcement task forces, supervisors are trained on management and supervision through the five-day Drug Task Force Supervisor School (DTFSS). The advanced program, open to experienced law enforcement trainers and supervisors overseeing multi-jurisdictional task forces, includes courses on funding and budgeting, liability and risk management, standard operating procedures for task forces, and operational concerns. The satisfaction survey for this training program showed that contact information for post-training consultation, along with other background information given beforehand, would be valuable.


Hate and Bias Crime Training Program

Another example of a train-the-trainer program, the four-day Hate and Bias Crime Training Program (HBCTP), is designed to improve the effectiveness of law enforcement agencies in recognizing, reporting, investigating, and prosecuting hate and bias crimes. Participants examine hate crimes on several jurisdictional levels and work on developing a practical approach for their own jurisdictions; they also take training aids back to their jurisdictions to set up in-service programs of their own. Participants requested more discussion and student involvement time during the training.


Drug Enforcement Training Program

Presentation material and training techniques for drug enforcement applications are part of the five-day, train-the-trainer Drug Enforcement Training Program (DETP), delivered for officers in rural jurisdictions. Trainers receive program guides, student handouts, instructional aids, and applicable exercises on subjects such as Informant Management, Surveillance Techniques, and Undercover Operations. Courses are tailored to the issues of small town and rural agencies and include instructional methodology training. Enrollees reported that they would have benefited from more hands-on, applied participation and more polished visual aids for the training.


Methodology

The following describes the structure and details of the survey conducted on the law enforcement training programs. A similar survey was used across the six training programs. The survey contains a general demographics section, a pre- and post-test evaluation, a program evaluation, and a follow-up evaluation. The questions within each evaluation vary depending on the training program; however, a consistent methodology is implemented across the survey.


General Demographics

This first section of the survey gathers information on the general background of the population attending the trainings. Participants are asked to provide not only personal information (e.g., age, gender, race, education) but also information on their drug-related work experience; staffing and other information on the departments they work for; past experience with job-related or drug-specific trainings; and expectations for the specific training they are attending.


Pre-Post Test Evaluation

Two sets of tests are administered, one before and one after the training. Each test has 20 multiple-choice questions that test participants' knowledge of specific law enforcement materials. The pre-test assesses an attendee's knowledge base prior to the training, while the post-test, which is identical to the pre-test, assesses any improvement in the attendee's knowledge as an effect of the training. By comparing the number or percentage of attendees who answered each question correctly before and after the training, the degree of knowledge transfer can be measured. For example, an increase in correct answers from the pre-test to the post-test indicates that participants learned new information during the training, while an unchanged score indicates that no measurable learning occurred.
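To illustrate the comparison, the following Python sketch (using a hypothetical data layout, not the actual NRALES instruments) computes the percentage of attendees answering each question correctly on the pre-test and the post-test and reports the per-question gain:

    # Hypothetical sketch: measuring knowledge transfer from pre-/post-test results.
    # Each answer sheet maps a question number (1-20) to True (correct) or False.

    def percent_correct(answer_sheets, question):
        """Percentage of attendees who answered the given question correctly."""
        correct = sum(1 for sheet in answer_sheets if sheet[question])
        return 100.0 * correct / len(answer_sheets)

    def knowledge_transfer(pre_sheets, post_sheets, num_questions=20):
        """Per-question gain in percent correct from pre-test to post-test."""
        return {
            q: percent_correct(post_sheets, q) - percent_correct(pre_sheets, q)
            for q in range(1, num_questions + 1)
        }

    # Example with three attendees and two questions, for brevity.
    pre = [{1: False, 2: True}, {1: False, 2: True}, {1: True, 2: False}]
    post = [{1: True, 2: True}, {1: True, 2: True}, {1: True, 2: False}]
    print(knowledge_transfer(pre, post, num_questions=2))
    # Question 1 improves by roughly 67 percentage points; question 2 shows no gain.

A positive gain for a question indicates that more attendees answered it correctly after the training than before.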


Program Evaluation

The program evaluation assesses whether expectations for the training have been met. Participants evaluate whether they acquired additional knowledge that could be applied to their positions and gained a better understanding of a specific issue as a result of the training. A list of statements regarding the training program is provided with a 1-to-5 scale on which attendees indicate the extent to which they agree or disagree with each statement. In the scale, 1 signifies “Strongly Disagree,” 2 “Disagree,” 3 “Neutral,” 4 “Agree,” and 5 “Strongly Agree.”


The program evaluation also assesses the overall training in terms of content of the class, schedule, instructors, training facility, and discussion topics.
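As a minimal sketch of how such ratings might be summarized (the statement texts and figures below are hypothetical, not drawn from the actual instrument), the mean rating and the share of attendees answering “Agree” or “Strongly Agree” can be computed per statement:

    # Hypothetical sketch: summarizing 1-5 Likert ratings per evaluation statement.
    from statistics import mean

    # Each key is an evaluation statement; each value lists attendee ratings
    # (1 = Strongly Disagree ... 5 = Strongly Agree).
    responses = {
        "Acquired knowledge applicable to my position": [4, 5, 4, 3, 5],
        "Gained a better understanding of the issue": [3, 4, 4, 4, 2],
    }

    for statement, ratings in responses.items():
        pct_agree = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)
        print(f"{statement}: mean = {mean(ratings):.2f}, agreement = {pct_agree:.0f}%")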


Follow-Up Evaluation

Training Relevance:

This section contains multiple-choice questions that ask participants how often they use information received from the training in their jobs, and whether there are any obstacles to using the training material in their work. Participants are also asked to rate statements regarding the usefulness of the course material and the applicability of the skills and knowledge received from the training to their jobs on a scale of 1 to 5 (1 signifying “Strongly Disagree,” 2 “Disagree,” 3 “Neutral,” 4 “Agree,” and 5 “Strongly Agree”).


Content Usefulness

Participants are asked to rate each of the subject areas covered during the training program on a scale of “Not Very Useful,” “Not Useful,” “Neutral,” “Useful,” and “Very Useful”; additional options are “No Opinion” and “Do Not Recall Section.” The figure below lists the subject areas evaluated for each training program. In addition, participants are asked to compare the content of the training to other training programs they have attended and to rank the top three subject areas that have been most useful in their jobs.

Figure E.4 Program Subject Areas in the Federal Law Enforcement Training Programs

Domestic Violence Training Program (DVTP): Adult Learning and Facilitation; Training Program Development; Evolution of Responses to Domestic Violence; Dynamics of Domestic Violence; Diversity Considerations in Safety Planning; Legal Definitions and Issues; 911 Emergency Call Taking; Officer Safety; Investigations, Follow-up Investigations & Special Cases; Domestic Violence Report Writing; Risk Management; Operational Planning

First Responder Training Program (FRTP): Past Responses to Major Incidents; Medical Emergencies and the First Responder; WMD and the First Responder; Response to an Active Shooter; Workplace Violence Safety Plan; Incident Command Systems; First Responder Duties; Crisis Intervention; Intelligence Collections; Civil Warrants Process

Drug Law Enforcement School for Patrol Officers (DLESP): Drug Recognition; Drug Field Testing; Developing Reasonable Suspicion; Looking Beyond the Ticket; Roadside Interviewing; Enforcement Options; Raves and Club Drugs; Indoor Marijuana Cultivation and Investigations; Clandestine Laboratory Safety and Awareness; Operational Planning; Parcel Interdiction; Community Relations

Drug Task Force Supervisor School (DTFSS): Management of a Task Force; Memorandum of Understanding; Developing Standard Operating Procedures; Task Force Funding Management; Case Management; Resources; Task Force Liability; Informant Management Program; Evidence and Property Management; Investigative Strategies; Offender Accountability; Clandestine Laboratory Safety

Hate and Bias Crime Training Program (HBCTP): Adult Learning and Facilitation; Training Program Development; History and Definitions; Recognition and Elements; Recognition of Organized Hate Groups; International Terrorism; Legal Considerations; Initial Response Procedures; Role of Law Enforcement; Technical Surveillance; Coordinated Community Response and COPS

Drug Enforcement Training Program (DETP): Adult Learning and Facilitation; Training Program Development (i.e., Drug Identification Block); Evidence Handling; Indoor/Outdoor Marijuana Operations; Patrol Interdiction; Case Management; Confidential Source Management; Risk Management/Undercover Operations; Physical Surveillance; Interviewing Techniques; Supervisory Role, Proactive/Community Policing; Media Relations


Training Impact

Students are asked to provide advice or suggested changes for the next training program based on their experience and the ways in which they have used the material since the training.


NRALES Sample Distribution

The figure below shows the frequency of training sessions and the number of participants in each training program conducted by FLETC's Office of State and Local Training from 2003 to 2006. The highest number of training sessions (5) was provided in the First Responder Training Program (FRTP), whereas the highest number of students (163) attended the Drug Law Enforcement School for Patrol Officers (DLESP). Seven of the sixteen law enforcement trainings (44%) were conducted in FEMA Region 4, with 210 attendees. The concentration in FEMA Region 4 indicates that the emphasis of training is on an area frequently battered by natural disasters.

Figure E.5 Participant Distribution in Nationwide Rural Area Law Enforcement Training

Note: Each rectangle in a column represents a single training course offered under the specific program. DVTP, for example, conducted four trainings in different time periods.



Takeaways

The evaluation report of the Nationwide Rural Area Law Enforcement Study (NRALES) includes some best practices with potential applications to the Survey of State and Local Government Emergency Officials as well as future DHS policy measures. In focusing on the training and homeland security needs of small town and rural area law enforcement agencies, the study served as a reference in devising survey questions on demographics relevant to rural areas, including questions about emergency preparedness officials and programs and about knowledge and expectations of programs. The methodology of the study, including the three- and six-month follow-ups, demonstrates the impact and usefulness of training, and this approach could be factored into contextualizing the results of the Survey of State and Local Government Emergency Officials. Overall, the NRALES demonstrates the need for improved communication between Federal and rural law enforcement training centers, as well as the need for established performance measures and useful instructional tools.


Recommendations from State Homeland Security Assessment and Strategy (SHSAS) Program, Technical Assistance 2004 Conference After-Action Reports (AARs)

Figure E.6 Emergency Response Participants

The State Homeland Security Assessment and Strategy (SHSAS) Data Review Project Technical Assistance effort provides technical assistance to State Administrative Agencies (SAAs) and their jurisdictions to enhance the capabilities of State and local representatives to prepare for and respond to a WMD incident. The technical assistance provided is intended for State-level administrators who oversee strategic planning efforts related to WMD terrorism response.


The project was divided into seven distinct modules, during which questions posed by the participants, as well as the answers provided by the workshop facilitators, were documented. Not every jurisdiction asked questions in every module. An AAR, divided into eight sections, followed the workshop and provided the jurisdictions with an opportunity to make comments on the workshop as a whole.

To ensure consistency, the data was assessed by a single reviewer. Each module and each AAR section was approached separately. All the questions asked and comments made by the 30 participating jurisdictions were grouped together according to question or comment type. The percentages of the different question and comment types were calculated from this data. The resulting percentages, in addition to the number of jurisdictions that asked questions and the number of questions asked in each module, were then used to compare the various modules.
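One way such a tabulation could be performed is sketched below in Python; the record layout and question types are hypothetical stand-ins rather than the reviewer's actual worksheet:

    # Hypothetical sketch: tallying question types per workshop module.
    from collections import Counter, defaultdict

    # Each record: (module, jurisdiction, question type) as logged by the reviewer.
    records = [
        ("Needs Assessment", "Jurisdiction A", "team definition"),
        ("Needs Assessment", "Jurisdiction B", "team definition"),
        ("Needs Assessment", "Jurisdiction B", "data entry"),
        ("Threat and Risk Assessment", "Jurisdiction A", "asset identification"),
    ]

    type_counts = defaultdict(Counter)
    module_jurisdictions = defaultdict(set)
    for module, jurisdiction, question_type in records:
        type_counts[module][question_type] += 1
        module_jurisdictions[module].add(jurisdiction)

    for module, counts in type_counts.items():
        total = sum(counts.values())
        asked = len(module_jurisdictions[module])
        print(f"{module}: {total} questions from {asked} jurisdictions")
        for question_type, n in counts.most_common():
            print(f"  {question_type}: {100 * n / total:.0f}%")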


The project began with a pre-workshop planning meeting for the participants in each jurisdiction to discuss their expectations of the workshop and to receive a general overview, agenda, and logistics information. The planning meeting was followed by an overview, which listed the total number of emergency response participants in each jurisdiction’s workshop and provided a breakdown of the services those individuals represented. Each jurisdiction’s workshop had between one and 19 participating emergency response personnel. Of the 222 total emergency response participants, emergency management personnel represented 58% of the total participants, governmental administrative personnel represented 15%, and law enforcement personnel represented 12%. None of the other categories, including fire, HazMat, EMS, public works, public health, health care, or public safety communication comprised more than 2% of the total number of participants.


The next phase of the workshop contained the seven modules. The Needs Assessment Module and Threat and Risk Assessment Module represent the sections in which the most questions were asked and the most jurisdictions asked questions. Half of the questions in the Needs Assessment Module focused on how teams (e.g. HazMat, public health, fire rescue, and decontamination) should be defined, organized, and counted.


On the other end of the spectrum, the Assessment, Update, and Submit Module and the Next Steps Module may have been the areas of least concern for the 30 jurisdictions, as evidenced by the relatively small number of questions asked by a relatively small number of jurisdictions. A moderate number of questions were asked in the other three modules, and few jurisdictions abstained from asking questions in those modules. The questions in these three modules, like those in the Needs Assessment Module and the Threat and Risk Assessment Module, maintained a focus on how to define and record personnel and data.


Upon concluding the workshop, the AAR was conducted to review the preparation, delivery, and recovery of the workshop. The AAR was divided into eight sections, each one addressing a different issue. The section prompting the most comments (31) was the Training Materials section. Most comments in this section criticized the fact that some materials (such as batteries and Ethernet cords) and documents (such as Quality Assurance Reviews and flowchart packets) were missing. After the Training Materials section, the Workshop Preparation/Setup section provoked the second largest number of comments (16). The comments in this section centered on the importance of preparation materials such as laptops with projectors, as well as prep time, to the success of the program.


Most jurisdictions found the workshop effective and beneficial. A total of 267 questions were answered by workshop facilitators; only eight questions remained unanswered. The workshop not only assisted jurisdictions in reaching a better understanding of how to organize and utilize their emergency response databases, but it also highlighted areas of confusion in the State Homeland Security Assessment and Strategy Development Data Review Project.


Recommendations from the State Homeland Security Assessment and Strategy (SHSAS) Program, Data Collection Tool (DCT)

In July 2003, the Office for Domestic Preparedness (ODP) launched the 2003 State Homeland Security Assessment and Strategy (SHSAS) program to serve as a planning tool for State and local jurisdictions and to assist ODP and its partners in better allocating Federal resources for homeland security. The assessment examined threats, vulnerabilities, capabilities, and needs related to State and local jurisdiction preparedness for WMD incidents in light of post-9/11 realities, in order to inform the development of State and urban area homeland security strategies.


States and local jurisdictions entered assessment data into the Data Collection Tool (DCT), the online application that ODP provided to facilitate the collection of the assessment data. States were allowed discretion in defining the granularity of the jurisdictions being assessed to accommodate geographic and population differences between the States. Jurisdictions included, but were not limited to: counties, municipalities, tribal nations, and/or national parks; response regions (containing many cities and counties); and the entire State.


A significant amount of general user feedback and specific recommendations was submitted by 48 States (information for New York and Vermont was not available), 4 U.S. territories, and the District of Columbia. A total of 1,000 jurisdictions (States, counties, cities, towns, etc.) cited more than 1,700 recommendations, excluding “not applicable” submissions. Each recommendation was categorized into one of the seven types of programs (grant management, intelligence sharing, training, incident management, regional coordination, critical infrastructure prioritization, and long-term homeland security) listed in the Homeland Security Appropriations Act of 2006 (H.R. 2360). Georgia and Washington provided the highest number of recommendations relative to the number of responding jurisdictions; conversely, Mississippi and North Dakota provided the fewest.


The recommendations housed in the DCT varied widely, ranging from user-friendliness issues to suggestions for better allocating Federal homeland security resources. Approximately 30% of the 1,700 relevant recommendations cited the inadequate user-friendliness of DHS programs. A majority of these user-friendliness concerns were accompanied by suggestions to revise and improve the online DCT application. Some of the most common grievances with the DCT online system concerned the duration of the process, the ambiguity of the application, and redundant questions. A city in Virginia offered a complaint that generally summarizes the common issues with the online tool: “This data submission tool is poorly prepared, labor intensive and does not accurately determine the individual jurisdictional needs.” Further exacerbating these concerns was the “lack of time,” leading to a “rush to get information and probably inaccuracies in some of the data.” Approximately 1% of the recommendations gave a positive user-friendliness rating to DHS programs and the DCT tool.


Figure E.7 SHSAS Data Composition

In addition, the recommendations were categorized as citing either an effective or an ineffective allocation of Federal homeland security resources. Nearly 10% of the recommendations cited the ineffectiveness of various aspects of DHS programs, while only 2% cited their effectiveness. The most common complaint came from rural communities, which expressed frustration with their relatively diminutive role in the allocation of DHS resources compared to urban jurisdictions. A county in Texas stated that the “assessment was not prepared with small jurisdictions in mind.” Typical complaints from rural communities included insufficient funding, inadequate staffing, and DCT application questions not tailored to their community type. Some jurisdictions felt that the SHSAS Program was “enlightening” and “very in-depth”; however, many cited issues with the program's implementation.


Nearly half of the applicable recommendations (860) could not be categorized into one of the seven types of programs outlined in H.R. 2360. Of the remaining recommendations (884), nearly 47% (403) pertained to training. Several jurisdictions cited the need to train not only first responders but local homeland security officials as well; one county in California cited the need for funding to “ensure all responders are trained at the appropriate level.” The other common types of recommendations pertained to regional coordination (16%) and incident management (14%). About 41% of the regional coordination recommendations also pertained to training and incident management, with many jurisdictions encouraging regional training and exercises. A significant number of jurisdictions cited a complete inability to adequately respond to a WMD incident. Only about 3% of the recommendations pertained to intelligence sharing, the fewest of any of the seven program types.
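A minimal sketch of such a categorization is shown below; the keyword mapping is invented for illustration and is far cruder than the manual review actually performed:

    # Hypothetical sketch: binning free-text recommendations into the seven
    # H.R. 2360 program types by keyword, then computing each type's share.
    from collections import Counter

    PROGRAM_KEYWORDS = {
        "training": ["train", "exercise"],
        "regional coordination": ["regional", "mutual aid"],
        "incident management": ["incident", "response"],
        "intelligence sharing": ["intelligence", "information sharing"],
        "grant management": ["grant", "funding"],
        "critical infrastructure prioritization": ["infrastructure", "asset"],
        "long-term homeland security": ["long-term", "sustainment"],
    }

    def categorize(text):
        lowered = text.lower()
        for program, keywords in PROGRAM_KEYWORDS.items():
            if any(keyword in lowered for keyword in keywords):
                return program
        return "uncategorized"

    recommendations = [
        "Ensure all responders are trained at the appropriate level",
        "Encourage regional training and exercises",
        "Improve intelligence sharing with local agencies",
        "Streamline the assessment tool",
    ]

    counts = Counter(categorize(r) for r in recommendations)
    total = sum(counts.values())
    for program, n in counts.most_common():
        print(f"{program}: {n} ({100 * n / total:.0f}%)")

Because a recommendation may touch several program types (as with the regional coordination recommendations that also concerned training), the first matching category wins in this simple scheme.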


The feedback from the SHSAS recommendations produced several general findings that can be used to improve the user-friendliness and effectiveness of future DHS surveys. For instance, when soliciting feedback from small rural communities that may not have a large local homeland security staff, researchers should ensure that the process is expedient. In addition, when designing an online tool, the system should be thoroughly tested, clearly structured, and comprehensively designed to include all types of reporting jurisdictions. Furthermore, it is necessary to avoid confusing terminology and an abundance of abbreviations; the survey language should be simple, clear, and explained in detail. Finally, support staff, both for the website and to address survey questions, should be readily available throughout the duration of the survey.


National Emergency Management Baseline Capability Assessment Program (NEMB-CAP) Progress Report

DHS and the Federal Emergency Management Agency (FEMA) developed a report on the interim findings of the National Emergency Management Baseline Capability Assessment Program (NEMB-CAP) for the period of January 2003 through December 2004. This report was based on the findings of an independent validation team that analyzed State emergency management capabilities against national standards. DHS conducted the assessment using Emergency Management Accreditation Program (EMAP) standards, which consist of 54 individual standards across 14 functional areas. DHS established a cooperative agreement with the Council of State Governments, through which EMAP conducted peer assessments of all States and State-level jurisdictions by 2005. The assessment process included DHS conducting on-site evaluations at various State and local jurisdictions with teams of five to seven emergency management personnel.


Of the 56 jurisdictions targeted for this review, 35 States (63%) participated. Of these 35 States, two were fully compliant (100%) with all 54 standards across the 14 functional areas. The two EMAP functional areas showing a high level of compliance were Logistics & Facilities and Hazard Mitigation. The most significant deficiencies existed in three functional areas: Planning; Hazard Identification & Risk Assessment; and Operations and Procedures. The average rate of compliance for the 35 States across all 54 EMAP standards was 53%. The report concluded that the functional areas considered most critical to operational efficacy were among the areas with the highest rates of noncompliance, and that the other critical areas showing low levels of compliance implied a systemic lack of interagency coordination. Overall, the findings indicated several high-level deficiencies across the participating States.
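The compliance arithmetic is straightforward; the sketch below (with fabricated inputs, since the EMAP results themselves are not reproduced here) computes each State's rate of compliance across the 54 standards and the average across participating States:

    # Hypothetical sketch: average compliance with the 54 EMAP standards.
    NUM_STANDARDS = 54

    # Each State maps to the set of standard numbers found compliant.
    state_compliance = {
        "State A": set(range(1, 55)),   # fully compliant (all 54)
        "State B": set(range(1, 30)),   # 29 standards met
        "State C": set(range(1, 28)),   # 27 standards met
    }

    rates = {
        state: 100 * len(met) / NUM_STANDARDS
        for state, met in state_compliance.items()
    }
    average = sum(rates.values()) / len(rates)

    for state, rate in sorted(rates.items()):
        print(f"{state}: {rate:.0f}% of standards met")
    print(f"Average compliance across States: {average:.0f}%")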


In the report, DHS discussed concerns regarding the assessment report quality and detail. DHS noted that participating States, observers, and assessment team members highlighted imprecise standards and subjective reviews as concerns. Also, many participants stated that several standards were overly complex and long. This degree of confusion, ambiguity, and perceived subjectivity undermined the findings of the NEMB-CAP Progress Report. The concerns cited in this report emphasize the importance of clarity, simplicity, and objectivity for any assessment being distributed broadly to a diverse audience. Furthermore, in citing these concerns, the report identifies its potential inadequacies before others can critique it.


Transportation Security Administration (TSA)/Office of Intelligence Customer Satisfaction Survey

This one-page survey was conducted to evaluate customer satisfaction with the “Weekly Field Intelligence Summary,” which is produced by TSA. The “Weekly Field Intelligence Summary” is a tool to ensure that TSA is providing the information its customers need. Fifty-one TSA customers participated in the survey.


The survey is composed of four questions, each of which can be answered with a simple “Yes” or “No.” In addition, respondents were asked to provide additional comments on the product at the end of the survey. The questions assess the relevance of the product to customers' needs, the timeliness of its delivery, the clarity and logical presentation of its information, and its impact on improving transportation security.
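Summarizing a survey of this form amounts to computing a “Yes” rate per question; the short sketch below uses hypothetical question labels and responses to illustrate:

    # Hypothetical sketch: percent "Yes" per question in a yes/no survey.
    questions = ["relevant", "timely", "clear and logical", "improves security"]

    # One list of True/False answers per respondent, in question order.
    respondents = [
        [True, True, True, False],
        [True, False, True, True],
        [True, True, False, True],
    ]

    for i, label in enumerate(questions):
        yes = sum(1 for answers in respondents if answers[i])
        print(f"{label}: {100 * yes / len(respondents):.0f}% Yes")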


Since the product is a weekly publication, the “user-friendliness” of the product is best addressed by the question asking whether the product presented the information in a clear and logical manner, which helps customers read and absorb the information easily.


The “effectiveness” of the product can be measured by the participants' responses regarding the relevance of the product to customers' needs and its impact on improving transportation security. The responses regarding the timely delivery of the product can also be used as a measure of “effectiveness,” as outdated intelligence will not be as effective as timely information.


Examples of Applicable OMB Measures

DHS reviewed the 2008 OMB Performance Measures for Preparedness. From this analysis, DHS identified several measures that are already used to assess DHS interaction at the State and local level. Examples of these measures, used as performance metrics to evaluate effectiveness in interacting with State and local entities, are presented below.


PREP0003.02 Cyber Security (CS)


Measure Type: Output


Performance Measure: Length of time to notify the Government Forum for Incident Response and Security Teams (GFIRST) and/or law enforcement officials within two hours of verifying a Category 1 or a Category 2 cyber security event.


Measure Description: None


Measure Type: Output


Performance Measure: Percent of targeted stakeholders who participate in or obtain cyber security products and services.


Measure Description: This measure assesses the impact of National Cyber Security Division (NCSD) activities targeting multiple stakeholders and NCSD's success in building effective partnerships with its stakeholders. As NCSD reaches a greater number of organizations and individuals, their awareness of the need for and the means of protecting cyberspace increases, and they act to implement NCSD recommendations to improve the security of cyberspace.


PREP0004 Grants, Training & Exercises


Measure Type: Outcome


Performance Measure: Percent of jurisdictions demonstrating acceptable performance on applicable critical tasks in exercises using G&T approved scenarios.


Measure Description: This measure evaluates jurisdictions’ performance on Homeland Security Exercise and Evaluation Program (HSEEP) critical tasks in homeland security exercises. Measuring improvements in jurisdictions’ performance on critical tasks over time reflects the impact of G&T preparedness activities on jurisdictions’ overall preparedness levels. To measure preparedness levels, critical task analyses included in exercise AARs are evaluated using HSEEP Exercise Evaluation Guides (EEGs) to determine whether the jurisdiction’s performance met expectations or required improvement. Jurisdictions’ performance on each critical task is analyzed by comparing the results documented in the AAR to the expected outcome described in the EEG.


PREP0004.02 State Preparedness Grants Program


Measure Type: Process


Performance Measure: Percent of State and local homeland security agency grant recipients reporting measurable progress towards identified goals and objectives to prevent and respond to terrorist attacks.


Measure Description: This measure collects the dates from when the Director of G&T approves a grant until the Office of the Comptroller awards the grant. It will demonstrate the program's effectiveness in the grants management process.


PREP0004.03 UASI Grants


Measure Type: Output


Performance Measure: Percent of participating urban area grant recipients reporting measurable progress made towards identified goals and objectives to prevent and respond to terrorist attacks.


Measure Description: TBD - New Measure



PREP0004.04 State and Local Training


Measure Type: Outcome


Performance Measure: Average percentage increase in WMD response and other KSAs of State and local homeland security preparedness professionals receiving training from pre- and post-assessments.


Measure Description: This measure evaluates improvements in State and local homeland security preparedness professionals’ KSAs due to delivery of training. Measuring these improvements indicates the impact of training services on the nation’s preparedness level. The measure is calculated using student self-evaluations administered by G&T training partners before and after delivery of training courses.


Measure Type: Output


Performance Measure: The number of State and local homeland security preparedness professionals trained each year.


Measure Description: This measure assesses the overall scope and reach of G&T's State and Local Training Program. Measuring the number of homeland security preparedness professionals trained each year reflects the impact of G&T's Training Program on improving homeland security capabilities. G&T's Centralized Scheduling Information Desk (CSID) maintains a database tracking the total number of homeland security preparedness professionals trained each year.


PREP0004.05 National Exercise Program


Measure Type: Outcome


Performance Measure: Average satisfaction rating by exercise participants.


Measure Description: This measure assesses how satisfied individual exercise participants are with the direct support exercises funded through the Office of Grants and Training (G&T). Exercise participants’ satisfaction reflects G&T’s ability to provide effective and useful exercise opportunities to Federal, State, and local jurisdictions. Exercise participants’ average satisfaction rating is based on feedback obtained through exercise participant surveys.


Measure Type: Output


Performance Measure: Number of grant funded exercise projects per year.


Measure Description: This measure is designed to assess the number of improvement plan action items that jurisdictions implement/execute following G&T-funded or supported exercise. Determining the percent of action items that are implemented reflects the impact of the National Exercise Program (NEP) on jurisdictions’ ability to identify and resolve issues and/or preparedness gaps. Data is collected from exercise AARs that include improvement plans and from participating jurisdictions’ responses to an online survey on action item implementation.


Measure Type: Outcome


Performance Measure: Percentage of action items identified in AARs that were implemented.


Measure Description: This measure is designed to assess the number of improvement plan action items that jurisdictions implement/execute following G&T-funded or supported exercise. Determining the percent of action items that are implemented reflects the impact of the NEP on jurisdictions’ ability to identify and resolve issues and/or preparedness gaps. Data is collected from exercise AARs that include improvement plans and from participating jurisdictions’ responses to an online survey on action item implementation.


PREP0004.06 Technical Assistance


Measure Type: Outcome


Performance Measure: Average satisfaction with Technical Assistance services provided to State and local jurisdictions.


Measure Description: This measure is intended to assess the effectiveness of the Technical Assistance Program by evaluating the satisfaction of States and local jurisdictions that have used its services.


Measure Type: Outcome


Performance Measure: Percentage of homeland security strategies that are compliant with DHS planning requirements at the submission date.


Measure Description: This measure assesses improvements in the thoroughness and completeness of homeland security strategies submitted by State and urban area jurisdictions to G&T. The measure reflects the Technical Assistance Program’s goal of strengthening and improving the homeland security strategy process. Data for this measure are derived from G&T’s review board process through which updated homeland security strategies are reviewed and approved.


PREP0006.03 Assistance to Firefighters Grants


Measure Type: Outcome


Performance Measure: Ratio of on-scene fire incident injuries to total number of active firefighters.


Measure Description: This measure assesses improvements in firefighter safety in jurisdictions receiving Assistance to Firefighters Grant (AFG) funds. The ratio of firefighter injuries to active firefighters reflects the effectiveness of AFG funds in promoting firefighter safety through its support for firefighter training, wellness programs, and protective equipment. Data for this measure reflects information collected through a survey sent to AFG recipients.


This page intentionally left blank.

ACRONYM LIST


AAR After-Action Report

ACSI American Customer Satisfaction Index

AFG Assistance to Firefighters Grant

BEA Bureau of Economic Analysis

CBP Customs and Border Protection

CIO Chief Information Officer

CISO Chief Information Security Officer

COE Center of Excellence

COG Continuity of Government

CPSC Consumer Product Safety Commission

CS Cyber Security

CSI Container Security Initiative

CSID Centralized Scheduling Information Desk

CTA Conservation Technical Assistance

C-TPAT Customs-Trade Partnership Against Terrorism

DBT Design Basis Threat

DCT Data Collection Tool

DETP Drug Enforcement Training Program

DHS Department of Homeland Security

DLESP Drug Law Enforcement School for Patrol Officers

DOI Department of the Interior

DTFSS Drug Task Force Supervisor School

DTIC Defense Technical Information Center

DVTP Domestic Violence Training Program

EEG Exercise Evaluation Guide

EMA Emergency Management Agencies

EMAP Emergency Management Accreditation Program

EMS Emergency Medical Services

EOP Emergency Operations Plan

FBI Federal Bureau of Investigation

FEMA Federal Emergency Management Agency

FLETC Federal Law Enforcement Training Center

FPS Federal Protective Service

FRTP First Responder Training Program

FS Fire Service

GAO Government Accountability Office

GFIRST Government Forum for Incident Response and Security Teams

GSA General Services Administration

G&T Office of Grants and Training

HBCTP Hate and Bias Crime Training Program

H.R. House Resolution

H.Rept. House Report

HSAS Homeland Security Advisory System

HSEEP Homeland Security Exercise and Evaluation Program

HSGP Homeland Security Grant Program

HSPD-8 Homeland Security Presidential Directive 8

KSA Knowledge, Skills, and Abilities

LLIS Lessons Learned Information Sharing

MIPT National Memorial Institute for the Prevention of Terrorism

MITT Mobile Implementation Training Team

MIX Metropolitan Information Exchange

NACo National Association of Counties

NADB National Asset Data Base

NASCIO National Association of State Chief Information Officers

NCR National Capital Region

NCSD National Cyber Security Division

NEMA National Emergency Management Association

NEMB-CAP National Emergency Management Baseline Capability Assessment Program

NEP National Exercise Program

NGA National Governors Association

NPG National Preparedness Goal

NPR Nationwide Plan Review

NPTF National Preparedness Task Force

NRALES Nationwide Rural Area Law Enforcement Study

NRC Nuclear Regulatory Commission

NRCS Natural Resources Conservation Service

ODP Office for Domestic Preparedness

OIG Office of Inspections and Special Reviews Inspector General

OMB Office of Management and Budget

OSL Office of State and Local Training

PART Program Assessment Rating Tool

PCR Program and Capability Review

PINS Planning Inspectorate

RCJC Rural Crime and Justice Center

RMD Risk Management Division

SAA State Administrative Agency

SHSAS State Homeland Security Assessment and Strategy Program

SHSGP State Homeland Security Grant Program

SSA Social Security Administration

TCL Target Capabilities List

TSA Transportation Security Administration

UASI Urban Area Security Initiative

UAWG Urban Area Working Group

US-CERT United States Computer Emergency Response Team

USDA United States Department of Agriculture

WMD Weapon of Mass Destruction



This page intentionally left blank.


1 Likert Scale: A scaling technique often used in surveys whereby items that are statements of belief or intention are generated. Each item is then judged by respondents as to whether it reflects a favorable or unfavorable attitude towards the object in question.

2 “Questions and Answers When Designing Surveys for Information Collections,” Office of Information and Regulatory Affairs, Office of Management and Budget, January 2006.

3 Ibid.

4 Salazar, Ken: Senator Salazar Releases Final Results of Homeland Security Survey, June 20, 2005.

5 Salazar, Ken: Memorandum: Homeland Security Survey: Improving Government Coordination, April 4, 2005.

6 Salazar, Ken: Senator Salazar Releases Final Results of Homeland Security Survey, June 20, 2005.

7 Comments of Senator Ken Salazar regarding amendment 1207 to the DHS Appropriations Act, 2006 – July 13, 2005.
