Survey of Urban and Rural Election Administration

OMB: 3265-0017



U. S. ELECTION ASSISTANCE COMMISSION

1201 New York Avenue, NW, Suite 300

Washington, DC 20005




Responses to supplemental information requested by OMB regarding EAC’s Urban/Rural Survey data collection effort

On May 16, 2012, EAC, in consultation with its survey contractor, the Human Resources Research Organization (HumRRO), submitted to OMB its supporting statement in support of a request for approval to conduct a survey of local election officials on the administration of elections in urban and rural areas.

On August 6, 2012, OMB conducted a conference call with EAC and HumRRO project staff seeking additional clarifying information on a number of issues, including: the longer-term goals and expected outcomes of the data collection effort; the rationale for the particular sampling methodologies to be employed; the handling of non-respondents in the data analysis; and the various methods to be used to follow up with survey non-respondents. Appropriate adjustments have been made to the original submission and are highlighted in yellow in the attached document. Further information regarding the nature of the project is provided below.

With a November general election less than 90 days away, this data collection effort is time-sensitive, so EAC and HumRRO research staff agreed to provide a quick-turnaround response to OMB by Friday, August 10.

Background on this data collection effort

From its inception, EAC has sought to execute a research agenda that provides tangible benefits to those seeking to better understand and administer American elections. Since 2004, EAC has conducted approximately 21 research studies in order to fulfill the research agenda set forth in HAVA. By law, EAC also is mandated to collect and report to Congress certain critical Federal election data, such as the number of voters registered, the number who voted, and the number of voters covered by the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) who cast ballots and whose ballots were counted. This data collection is done by administering the EAC Election Administration and Voting Survey (EAVS) to 55 states and territories. This biennial census of election data is accompanied by a Statutory Overview, a summary of key state election statutes and regulations for the 55 states and territories.

HAVA section 241(b)(15) requires EAC to study "matters particularly relevant to voting and administering elections in rural and urban areas". In response to this mandate, EAC seeks to conduct a study providing information that, ultimately, will be useful and relevant to the elections community in its conduct of elections. In late 2009 and early 2010, EAC conducted two working group meetings involving a geographically diverse group of local election officials; participants considered the current challenges they face in administering elections in urban and rural areas. Key issues and challenges identified during the discussions included finding efficient and effective ways to reach voters, recruiting and retaining sufficiently well-trained elections personnel, and identifying innovative ways to cut costs when managing various aspects of the elections process.

Identifying these topics and key issues provided a framework for the questions developed in the Urban/Rural Survey instrument. EAC envisages conducting this national, random sample survey of local elections officials on issues related to conducting elections in rural and urban areas as the first phase of the study.1

Expected outcomes, goals and objectives of the Urban Rural study

During its relatively brief existence, EAC has been able to develop a reliable state-by-state database of certain election data and to perform key election research studies on matters Congress has deemed important. In this period, EAC has also developed a robust Election Management Guidelines (EMG) program, which has been informed and augmented by EAC's research studies. The EMG program, EAC's research studies and their key findings, and EAC's Election Official Exchange are the key components that comprise EAC's HAVA-mandated Clearinghouse of Election Information.

The expected outcome of the proposed Urban/Rural Survey is to provide useful and practical best-practice information to urban and rural local officials via EAC's Clearinghouse of Election Information.

EAC envisages that, ultimately, the findings and analysis from the Urban/Rural Survey will identify certain helpful best practices for administering elections in rural and urban areas.2 These best practices would, in turn, be highlighted in one or more of the following: a chapter in the Election Management Guidelines "textbook"; a Quick Start Guide on administering an election in an urban or rural area; a feature in EAC's bimonthly Newsline; an exchange of best-practice information on EAC's online Election Official Exchange; and an online dialogue resulting from EAC blog and Twitter posts. EAC believes all these media and venues offer prime opportunities for election officials to exchange best-practice information, to learn from one another, and to problem-solve on challenges that affect their ability to administer elections.

Attached you will find a sample Election Management Guidelines chapter and a sample Quick Start Guide to Election Administration. The information garnered from the key findings and analysis of the Urban/Rural Survey would be used, in part, to develop a similar set of educational resources on administering an election in a rural area and administering an election in an urban area.

Supporting Statement for Request for OMB Approval

U.S. Election Assistance Commission

Urban/Rural Study


Table of Contents


A. Justification

1. Circumstances that make collection of information necessary
2. How information will be used
3. Use of automated collection techniques
4. Efforts to avoid duplication
5. Impact on small businesses
6. Consequences of not conducting
7. Special circumstances
8. Outside consultations
9. Remuneration
10. Confidentiality
11. Sensitive information
12. Estimates of hour burden
13. Cost to respondents
14. Cost to Federal government
15. Reasons for program changes
16. Plans for tabulation/publication
17. Display of OMB approval
18. Exceptions


B. Collections of Information Employing Statistical Methods

1. Sampling
2. Procedures for collecting information
3. Methods to maximize response rates
4. Tests of procedures
5. Individuals consulted on sampling


References

Appendix A: P.L. 107-252
Appendix B: Federal Register Notices
Appendix C: Survey Instruments


List of Tables

1. Contract Costs
2. Project Timeline
3. Frequencies of Elections Jurisdictions by Region and Urban/Rural Status
4. Distribution of Sample for the Strata
5. Expected Number of Completed Surveys by Stratum
6. Minimum Detectable Differences at the 95% Confidence Level


A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.



Political scientists often refer to the work of local election officials (LEOs) as the "invisible election," because what they do—hiring election workers, buying and maintaining voting machines, producing ballots, and establishing polling locations, among numerous other things—is largely unreported. Like other "street-level bureaucrats," LEOs "occupy a critical position in American society… [T]he actions of most public service workers actually constitute the services delivered by government" (Lipsky, p. 3, 1983). Though their work goes relatively unnoticed, election officials at all levels of government play an important role in implementing voting laws. They also shape the ways in which people are able to vote, affect the quality of the voting experience, and can ultimately determine which votes count (Kimball & Kropf, p. 1, 2006). The extent to which LEOs perform their duties responsibly and effectively determines how free and fair our voting is.


Each state and the District of Columbia has its own election system, resulting in 51 fairly distinct approaches to holding elections. Even within each of these 51 systems, there is considerable variation in the way elections are administered. Some of this variation is simply a matter of population size. For example, a very large voting jurisdiction, such as Los Angeles County with millions of registered voters, will have different challenges and needs than a sparsely populated jurisdiction in Wyoming. In addition, each state has a different level of standardization in implementing elections. Some have more centralized statutory requirements so that the voting rules are the same in every city or county, while other states' requirements lack any standardization whatsoever. Similarly, states and jurisdictions within states can determine the type of technology they will use to count votes. The technology runs the gamut from direct recording electronic (DRE) devices and optical scanners to lever machines, hand-counted paper ballots, and vote-by-phone systems. Finally, as the United States is a multicultural country, the diverse languages and customs of its voters can create additional challenges for LEOs.


In the wake of the 2000 presidential election, calls for election change were coming from all areas of public life—the media; LEOs; local, state, and national legislators; and public interest groups too numerous to count. A very damaging joint study by the Massachusetts and California Institutes of Technology estimated that 4 to 6 million presidential votes were lost in the 2000 election. Of these, the authors estimated that 1.5 to 3 million were lost because of registration problems, up to 1 million because of polling place operations, and an unknown number because of absentee ballot problems (Alvarez et al., p. 9, 2001). Similarly, the authors noted that "below the office of president the incidence of spoiled, unmarked, and uncounted ballots is much higher: five percent of ballots do not record a Senate or gubernatorial vote. And there are significant differences across equipment types in the incidence of uncounted ballots" (Alvarez et al., p. 8, 2001).

Members of Congress as well as several congressional committees asked the U.S. Government Accountability Office (GAO) to review the characteristics of the 2000 election. Responding to these requests, GAO produced a series of reports spotlighting the problems found at each stage of the election process, including voter registration, early/absentee voting, organizing and conducting activities on election day, and vote counting and certification (General Accounting Office, 2001). To ameliorate these challenges, Congress enacted the Help America Vote Act of 2002 (HAVA), which authorized $3.9 billion over the following three years to be spent by states to upgrade their voting equipment and procedures. It also created an independent national clearinghouse and resource for federal election administration assistance called the Election Assistance Commission (EAC).


The EAC provides grants, conducts voluntary testing and certification of voting systems, studies election issues to promote effective voting, and provides guidance and guidelines for voting systems and other HAVA requirements. Under HAVA section 241(b)(15), the EAC is also mandated to "conduct and make available to the public studies regarding…election administration issues," including "matters particularly relevant to voting and administering elections in rural and urban areas."


Historically, election researchers have been interested in the effects of voting legislation on turnout, the expansion of voter rights, voter enfranchisement, partisan alignment, and other voting behavior. Research in this arena was an attempt to understand participation in one of the most fundamental democratic processes. Because issues of gender, race/ethnicity, language ability—minority discrimination in general—have historically been the focus of voting participation studies, the question of differences between urban and rural voting behavior has been broadly overlooked. Early studies examining the correlation between urbanization and participation (Milbrath 1965; Nie, Powell, and Prewitt, 1969; Verba and Nie 1972) were inconclusive, contradictory, or they simply assumed there was higher voter turnout in urban areas without much supporting evidence (Monroe 1977).


More recently, while the treatment of urbanization in voting behavior research has been more prevalent, this work has focused primarily on partisan alignment. After the election crisis of 2000, however, scholars have paid more attention to election reform—and with it, issues such as ballot design, the relationship between socioeconomic factors and voting equipment, the factors explaining how election reform is adopted, voting errors, etc. With the passage of HAVA, the issue of urban vs. rural participation has come under more scrutiny, especially because understanding the differences is now a matter of federal statute. With respect to the urban vs. rural effects on participation, Creek and Karnes (2009) found there were differences in relative costs to becoming HAVA compliant. They also found that the experiences of election administrators were different depending on the state’s centralization of election administration and the level of cooperation between state and local officials. Certainly with HAVA, there has been an increased interdependence among federal, state, and local governments in determining election administration (Liebschutz and Palazzolo 2005).


With a growing focus on urbanization, it is becoming clearer that there are differences in urban and rural election officials' abilities to comply with HAVA. These conclusions are echoed in the Pew Center on the States' Make Voting Work study of 2008 and, to a lesser extent, in Rachlin's Making Every Vote Count (2006). The authors of these works note that the challenges facing urban election officials differ from those facing rural election officials. For example, "Local election officials in jurisdictions with more than a million voters and dedicated information technology staff face entirely different challenges in securing, maintaining and operating voting technology than their brethren in smaller jurisdictions" (Gronke & Caudell-Feagan, 2008, pp. 13-14). Election administrators in rural areas, on the other hand, sometimes lack expertise in information technology, places to store voting equipment (Gronke & Caudell-Feagan, 2008, p. 14), sufficient personnel, and/or sufficient funds to replace equipment (Rachlin 2006, p. 80).


As mentioned previously, HAVA section 241(b)(15) requires the Election Assistance Commission to study matters relevant to administering elections in rural and urban areas.

With that in mind, the EAC convened a working group of Local Election Officials (LEOs) representing both rural and urban regions, as well as researchers and other experts in this field. The participants considered a variety of issues, including methods of defining “urban” and “rural,” and differences encountered between the two along such lines as voting place location and outreach efforts. The goal of this endeavor was to identify factors that potentially influence the administration of elections in areas that vary in geographic/population size and density to such an extent that they warrant further research.


Based on the discussions, EAC staff concluded that two topics of particular importance are voter outreach activities and staffing (i.e., regular full- and part-time elections office personnel and poll workers hired on a temporary basis). It was also acknowledged that many of the variations in these two (and other) areas may be resource based, suggesting that available funding also be included as a factor in future research.


Given that there is no single source of information on the entire range of elections administration practices, the best means of obtaining insights into the topic of interest in this study is to go to the LEOs who have first-hand knowledge of the procedures followed in their jurisdictions, as well as the challenges they face. Therefore, EAC staff decided to conduct a survey of a sample of such officials to investigate the differences in practices and problems in rural and urban districts, with the goal of identifying the sources of the issues faced as well as possible solutions. In addition, the input of LEOs should point to best practices in the areas of staffing and voter outreach that can be specifically applied in urban and rural areas.


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The data generated through this effort will be used directly by EAC staff as they seek to identify challenges specific to the administration of federal elections in rural and urban areas, as well as best practices that may help to address those challenges. This is a new, one-time data collection.

  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


In 2009, EAC personnel used a variety of methods to assemble a complete listing of LEOs, including name (where available), mailing address, and email address. Unfortunately, nine states declined to provide email addresses for their LEOs; the LEOs in those states comprise about one-third of the population. In addition, given the considerable time that has elapsed since the database was developed, it is likely that contact information that includes the name of the LEO (e.g., an email address) will be outdated in a significant number of cases. Given these facts, the primary method of data collection will be electronic: sample members for whom email addresses are available will be notified electronically that the survey is being conducted, asked to participate, and provided a URL and instructions for accessing and completing the survey. The portion of the sample for which email addresses are not available, as well as cases in which such addresses are found to be invalid, will be sent a paper version of the survey. The cover letter accompanying the instrument will explain that it can also be completed online, and instructions for doing so will be included.


  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.


As mentioned earlier, the subject of differences in the demands and challenges faced by election officials in rural and urban areas has received some attention. However, no systematic data collection involving LEOs has been undertaken. Instead, the conclusions drawn to date have largely consisted of generalizations based on macro-level knowledge of the variations found across jurisdictions (e.g., overall size of election administration budgets and staff). In other instances, researchers have relied on a case study approach. For instance, Creek and Karnes (2009) examined the experiences of officials in Maryland, New York, and Virginia in complying with HAVA, using historical records of actions taken and interviews with officials at the state and local level. Liebschutz and Palazzolo based their conclusions on a similar approach focusing on New York, New Jersey, and Pennsylvania. Gronke and Stewart (2008) cite the difficulties in collecting data more directly from those on the front lines of election administration in the United States, but conclude that, "While registration rolls and election returns form the core of the elections data, federal, state, and local officials need to think creatively about better ways to collect information about the performance of the elections system" (p. 10). Among the approaches they recommend are sample surveys of election officials, such as the one under consideration here.


  5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-1), describe any methods used to minimize burden.


This collection of information does not have an impact on small businesses or other small entities.

  6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The Help America Vote Act of 2002 (HAVA) created new mandatory minimum standards for states to follow in several key areas of election administration. HAVA also established the Election Assistance Commission (EAC) to assist the states with HAVA compliance. In carrying out this mandate, EAC is charged with conducting research on topics related to the administration of elections, including matters related to urban/rural differences, and making the results known to the public. An in-depth understanding of the challenges faced by election officials with regard to the critical issues of voter outreach and staffing cannot be achieved with the data currently available. Rather, input must be obtained from those responsible for the management of these functions. Further, to gain an understanding of how such challenges vary based on jurisdiction population and population density, it is essential that sampling be conducted to ensure a sufficient range of variation on these dimensions so that the proper comparisons can be made. This will be accomplished through the sampling plan described in Part B of this application.


In sum, lacking these data, the EAC will be unable to comply with one of the mandates it was given at the time of its creation. There are no technical or legal obstacles to reducing burden.


  7. Explain any special circumstances that would cause an information collection to be conducted in a manner: (a) requiring respondents to report information to the agency more often than quarterly; (b) requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; (c) requiring respondents to submit more than an original and two copies of any document; (d) requiring respondent to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years; (e) in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study; (f) requiring the use of a statistical data classification that has not been reviewed and approved by OMB; (g) that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or (h) requiring respondents to submit proprietary trade secrets or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


(a) This is a one-time-only data collection. Respondents will only be asked to provide input once. (b) The surveys will be composed primarily of close-ended items requiring respondents to select from available options. An “other, please specify” option will also be available in cases where the range of potential responses is such that the comprehensiveness of the answer choices cannot be guaranteed. Respondents will also be presented with two open-ended items allowing them to provide additional comments regarding the administration of federal elections in urban and rural settings. Whether they provide such feedback, as well as its depth, is at their discretion. We hope that the survey will be in the field 30 days or less, but do not feel that the nature of the written responses presents an undue burden. (c/d) Respondents are not being asked to submit documentation of any kind, nor are they required to retain records related to this research. (e) The sampling plan detailed in Part B was developed with the goal of obtaining adequate representation of both urban and rural districts to allow for statistically valid comparisons between the groups. In addition, the sampling method employed will ensure proportional representation of LEOs by geographic region so that analyses can be conducted to determine if there are differences in experience based on location. (f) No unapproved statistical data classifications will be employed. (g) Respondents will be guaranteed that results will only be reported in the aggregate, and that no breakdowns of the data will be reported that would allow for identification of individuals. Any information that could lead to such identification will be deleted from the database as soon as it is identified. (h) No proprietary or confidential information will be requested.

  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years—even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


The Federal Register notices are included in this package as Appendix B. Only one comment was received; it questioned the value of the survey.

As mentioned previously, EAC staff conducted workshops with LEOs representing both urban and rural jurisdictions, as well as researchers and other experts in this field. Discussions centered on issues pertinent to this topic, and ideas were solicited regarding the dimensions of importance that required further study. Based on this input, a survey was created centering on the issues of importance. This instrument was reviewed by individuals experienced in survey design and suggested changes were incorporated. The draft instrument was then circulated to six LEOs identified by the EAC who were asked to complete it and take part in an interview to obtain their input on the clarity and comprehensiveness of the questions as well as suggestions for additional content. The outcomes of this process were reviewed by the project team, and alterations/additions were made where deemed advisable.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No payment or gift is being offered for participation.

  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Respondents will be assured that the information they provide will not be attributed directly to them and that all data will be reported on an aggregate basis only. No identifying information is being collected through the survey unless the respondent provides it by agreeing to participate in more in-depth interviews regarding the topic of elections administration in rural/urban areas. When this happens, the information will be extracted to be provided to the EAC and subsequently deleted from the database. The contractor for this effort, the Human Resources Research Organization, maintains an Institutional Review Board (IRB00000257) and a Federal-Wide Assurance (FWA00009492) currently on file with the Department of Health and Human Services. This committee will ensure that legally effective informed consent is obtained and respondent privacy is honored.

  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps taken to obtain their consent.


No sensitive information is being collected as part of this effort.

  12. Provide estimates of the hour burden of the collection of information. The statement should: Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I. Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.


In structuring the survey, every effort was made to maintain a focus on the central issues of importance and to minimize the collection of ancillary information. The number of close-ended responses required ranges from a minimum of 66 to a maximum of 69, when accounting for follow-on questions. The information sought should be readily available to respondents and require little to no research on their part. Therefore, we anticipate that the survey will take no more than 30 minutes to complete.


  13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14). The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities. If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic impact analysis associated with the rulemaking containing the information collection, as appropriate. Generally, estimates should not include purchases of equipment or services or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no capital or start-up costs associated with this information collection.


  14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff) and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


This is a one-time data collection. No equipment, software, systems, or technology will be purchased to support this effort. The associated contractor costs are $195,330. This includes consultation on survey item development; survey design and online formatting; follow-ups; and database development and analysis. The contract costs are detailed below.


Table 1. Contract Costs

Personnel                      Hours   Hourly Rate      Amount
Program Manager                  280        211.61     $59,250
Senior Project Manager           181        160.09     $28,977
Consultant                       144        126.75     $18,252
Software Engineer                104        122.48     $12,738
Senior Consultant                173        161.99     $28,025
Research Support                  16         85.25      $1,364
Total Salaries                 2,072                  $148,606

Other Direct Costs
Printing, Mailing, Postage                              $18,927
Data Entry                                               $4,193
Statistical Consultant                                  $23,604
Total ODCs                                              $46,724

Total Firm Fixed Price                                 $195,330




  15. Explain the reasons for program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


N/A


  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of the report, publication dates, and other actions.


The results of this study will be used by EAC staff as they seek to identify challenges faced in the administration of federal elections by those in urban and rural areas. The results may also point to practices that are particularly suited to jurisdictions that differ in the size and dispersion of their voting populations. Only aggregated data will be reported, with breakdowns on key variables that are hypothesized to have a potential impact on experiences (e.g., rural/urban, state, size of jurisdiction). Note that breakouts will not be done in situations where doing so could lead to the identification of respondents. At this time, it is anticipated that the report will include an overview of respondents on key variables (e.g., location, jurisdiction size). This will be followed by a presentation of overall results regarding outreach efforts and experience with poll workers. Breakouts will then be provided along key background variables, including urban/rural designation, staff size, and available funding. The report will conclude with an overview of the results that highlights key findings regarding differences between election administration in rural and urban areas, challenges faced by each, and factors that the data suggest may be associated with overcoming those challenges. We anticipate that this presentation will be accomplished through the use of simple frequencies and cross-tabulations. Specific analyses to be conducted include, but are not limited to, the following (an illustrative analysis sketch follows the list):


Report Basic Demographics (run urban/rural comparisons on all and report significant differences)

  1. Tenure—average, range (item 1)

  2. Percent elected versus appointed (item 1a)

  3. Number of registered voters—average, range (item 2)

  4. Compare urban/rural classification based on ERS data with self-reported urban/rural status. (data in database with item 3)

  5. Percent required to provide language assistance and required languages (items 4 and 4a)

  6. Percent with full responsibility for election administration and where responsibility lies for those without (items 5 and 5a, content analyze “other”)

  7. Percent allowing absentee/no-excuse absentee/early/by-mail voting

Voter Outreach

  1. Do urban/rural districts differ in their assessment of the ease with which voter outreach activities can be conducted?

    1. Crosstab urban rural classifications with item 13

  2. Do urban/rural districts differ in the types of voter outreach they conduct?

    1. Crosstab urban rural classifications with item 7

  3. Do urban/rural districts differ in their assessment of the nature of the problems experienced in conducting voter outreach?

    1. Crosstab urban/rural classifications with item 14

  4. Is there a relationship between size of district and amount spent on voter outreach?

    1. Correlate items 2 and 10/11

  5. Is there a relationship between the amount spent on voter outreach and the assessment of the ease of doing so?

    1. Crosstab items 10/11 and 13

  6. Are there differences between urban/rural jurisdictions in regard to the use of partnerships, organizations partnered with, or activities engaged in?

    1. Crosstab urban/rural classifications with items 8, 8a, and 8b

  7. Is there a relationship between assessed ease of conducting voter outreach and (a) use of partnerships, (b) organizations partnered with, and (c) types of activities engaged in with partners?

    1. Crosstab items 8 and 13, 8a and 13, and 8b and 13 (within urban/rural classifications if significant differences in 6)

  8. Is there a relationship between ease of conducting voter outreach and tenure of voting official?

    1. Group years of tenure and crosstab with item 13.

  9. Is there a relationship between LEO tenure and types of outreach activities conducted, and whether partners are used?

    1. Group years of tenure and crosstab with items 7 and 8.

  10. Is there a relationship between ease of conducting voter outreach and the requirement to provide language assistance?

    1. Crosstab items 4 and 13

    2. If so, crosstab number of languages 4(a) with 13 to see if there is an impact here

  11. Is there a relationship between level of responsibility for elections administration and ease of conducting voter outreach?

    1. Crosstab item 5 with item 13

Staffing

  1. What is the relationship between urban/rural status and staff size?

    1. Crosstab urban/rural classifications and 15/17 a/b/c/d

  2. Is there a relationship between staff size and ease of conducting voter outreach?

    1. Crosstab items 15/17 a/b/c/d with item 13

  3. What is the relationship between jurisdiction size and number of poll workers?

    1. Correlate Items 2 and 16/18

  4. Is there a relationship between urban/rural classifications and number of poll workers?

    1. Compare average number of poll workers between urban/rural classification groups

  5. Is there a relationship between urban/rural classifications and ease of recruiting poll workers?

    1. Crosstab urban/rural classifications with item 22

  6. Is there a relationship between urban/rural classifications and poll worker recruiting methods?

    1. Crosstab urban/rural classifications and sources identified in item 21

  7. Is there a relationship between recruiting methods used and assessed ease of recruiting poll workers?

    1. Crosstab items 21 and 22

    2. Do within urban/rural classifications if 17 is significant

  8. Are there differences in the assessment of the success achieved with various recruiting methods based on urban/rural classification?

    1. Crosstab successfulness ratings for each source by urban/rural classifications

  9. Are there differences between urban/rural classifications in regard to whether poll workers are paid?

    1. Crosstab urban/rural classifications and items 19/20

  10. Is there a relationship between ease of recruiting and whether poll workers are paid?

    1. Crosstab items 22 and 19

    2. Do within urban/rural classifications if 19 is significant

  11. Is there a relationship between ease of recruiting and how much poll workers are paid?

    1. Combine 19a and 20a, group data if necessary, and crosstab with 22

  12. Is there a relationship between ease of recruiting poll workers and whether split shifts are offered?

    1. Crosstab items 22 and 24

  13. Are there differences between urban/rural classifications and problems encountered in recruiting poll workers?

    1. Crosstab urban/rural classifications and item 23

Open-ended responses

  1. Perform content analysis of open-ended items (8c and 25). Where possible, develop categories of response and examine them by urban/rural status. Also use verbatim responses to illustrate approaches and issues raised by LEOs from urban and rural areas.
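
The tabulations listed above would be produced with standard statistical software. The following is a minimal sketch of the frequency and cross-tabulation approach only; the file name and variable names (e.g., urban_rural, item13_outreach_ease) are hypothetical placeholders, and the actual names and codings would come from the final instrument and codebook.

```python
# Illustrative sketch of the planned frequency/cross-tabulation analyses.
# File name and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.read_csv("urban_rural_survey_responses.csv")

# Overall frequencies for a single item
print(responses["item13_outreach_ease"].value_counts())

# Cross-tabulation of urban/rural classification with assessed ease of voter outreach
table = pd.crosstab(responses["urban_rural"], responses["item13_outreach_ease"])
chi2, p_value, dof, _ = chi2_contingency(table)

print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```

The same pattern (crosstab plus a chi-square test, or a correlation for the two continuous items) would be repeated for each of the comparisons listed above.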



Table 2. Project Timeline

Milestone                      Date
Start Date                     30 September 2011
Evaluation Plan                29 November 2011
Final Surveys                  30 December 2011
OMB Package                    20 January 2012
OMB Clearance                  20 August 2012
Survey Distribution            23 August 2012
Survey Database                24 September 2012
Draft Evaluation Report        7 October 2012
Final Report                   28 October 2012




  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


N/A

  18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.


N/A

B. Collections of Information Employing Statistical Methods


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in a tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the data collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The sample design proposed here attempts to accomplish two goals. The first is to distribute the sample sufficiently so that comparisons can be made across communities of different sizes. This ability to make comparisons by size may be especially useful in drawing conclusions about the resources available within counties to encourage and facilitate voting. The second goal is to distribute the sample by region so that it is truly nationally representative.


The Economic Research Service (ERS) of the U.S. Department of Agriculture developed a system of classifying municipal areas and counties along a rural-urban continuum with 9 values. Table 3 presents the 9 classifications along with the number of jurisdictions in each.3 Note that in most states, LEOs are assigned at the county level and can be directly classified using the ERS system. The exceptions to this rule are the New England states (i.e., CT, MA, ME, NH, RI, and VT), as well as Alaska and Wisconsin, where LEOs generally serve at the municipal/town level. In these instances, we will assign rural/urban codes based on the county in which the town or municipality is located.


We will use this system to categorize each of the districts represented in the existing LEO database (N = 4,616). The first step is to divide the jurisdictions into groups using the 1 through 9 categorization of counties according to size and urban/rural status. Counties that fall into categories 1 through 7 comprise the first stratum, and counties that fall into categories 8 and 9 form the second stratum. Counties in both of these initial strata are further divided by region, ultimately forming 28 + 8 = 36 strata in total.
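
For illustration only, the sketch below shows how a jurisdiction record might be assigned to one of these 36 strata. The record structure and field names (region, ers_code) are assumptions for the example, not the contractor's actual sampling code.

```python
# Illustrative sketch of assigning jurisdictions to the 36 sampling strata
# (ERS codes 1-9 crossed with the four Census regions). Field names are hypothetical.

def assign_stratum(jurisdiction):
    """Return the (certainty flag, ERS code, region) stratum for one jurisdiction."""
    ers_code = jurisdiction["ers_code"]   # rural-urban continuum code, 1-9
    region = jurisdiction["region"]       # "Northeast", "Midwest", "South", or "West"

    # Codes 1-7 are sampled; codes 8-9 (most rural) are taken with certainty.
    certainty = ers_code >= 8
    return (certainty, ers_code, region)

# Example with a made-up record:
example = {"state": "KS", "region": "Midwest", "ers_code": 6}
print(assign_stratum(example))            # (False, 6, 'Midwest')
```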









Table 3. Frequencies of Elections Jurisdictions by Region and Urban/Rural Status

Code                                                                   Northeast  Midwest  South  West  Total
Metro counties:
  1. Counties in metro areas of 1 million population or more                359      106    211    39    715
  2. Counties in metro areas of 250,000 to 1 million population             329       83    160    42    614
  3. Counties in metro areas of fewer than 250,000 population               175      104    180    50    509
Nonmetro counties:
  4. Urban population of 20,000 or more, adjacent to a metro area           178       65     93    30    366
  5. Urban population of 20,000 or more, not adjacent to a metro area        71       45     32    24    172
  6. Urban population of 2,500 to 19,999, adjacent to a metro area          240      184    336    59    819
  7. Urban population of 2,500 to 19,999, not adjacent to a metro area      254      178    163    84    679
  8. Completely rural or less than 2,500 urban population,
     adjacent to a metro area                                                68       75    117    33    293
  9. Completely rural or less than 2,500 urban population,
     not adjacent to a metro area                                            30      224    132    63    449
Total                                                                      1,704    1,064  1,424   424  4,616


All counties in the second stratum are sampled with certainty, which has the effect of drawing in all very rural counties. For the first stratum (categories 1 through 7), approximately 400 completed surveys are desired so that specific statements can be made about metro versus nonmetro counties and comparisons can be made across regions. However, we anticipate a response rate of about 20%. Therefore, a sample of 2,000 election officials is proposed for the metro and nonmetro counties. The sample size distribution is given in Table 4 below. Note that the sample is distributed proportionately to the size of each of these 28 strata (categories 1 through 7 crossed with region).


Table 4. Distribution of Sample for the Strata


Sample Sizes

Code      Northeast  Midwest  South  West   Total
1               185       55    109    20     369
2               170       43     83    22     317
3                90       54     93    26     263
4                92       34     48    15     189
5                37       23     17    12      89
6               124       95    173    30     423
7               131       92     84    43     351
8                68       75    117    33     293
9                30      224    132    63     449
Total           927      694    856   265   2,742


Note that the sample sizes for rows 8 and 9 are the population sizes because of sampling with certainty in these two rows. For rows 1 through 7, the sample size sums to 2,000.
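
The allocation shown in Table 4 can be reproduced, to within rounding, with a simple proportional-to-size calculation. The sketch below is illustrative only; the stratum counts come from Table 3, and only a few strata are spelled out.

```python
# Illustrative proportional allocation of the 2,000-case sample across the
# code 1-7 strata; code 8-9 strata are taken with certainty (all jurisdictions).
population = {                      # (ERS code, region): jurisdiction count from Table 3
    (1, "Northeast"): 359, (1, "Midwest"): 106, (1, "South"): 211, (1, "West"): 39,
    (8, "Northeast"): 68,  (9, "Midwest"): 224,
    # ... remaining strata omitted for brevity ...
}

TARGET_SAMPLE = 2000                # desired sample for codes 1-7 combined
TOTAL_CODES_1_TO_7 = 3874           # 4,616 total minus the 293 + 449 certainty counties

def allocated_sample(stratum):
    code, _region = stratum
    count = population[stratum]
    if code >= 8:                   # certainty strata: include every jurisdiction
        return count
    return round(TARGET_SAMPLE * count / TOTAL_CODES_1_TO_7)

for stratum in sorted(population):
    print(stratum, allocated_sample(stratum))
# e.g. (1, 'Northeast') -> 185, matching Table 4
```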


With an expected response rate of 20%, the expected number of completed surveys is one-fifth the number in each cell. The expected numbers of completed surveys are given in Table 5.


Table 5: Expected Number of Completed Surveys by Stratum


Expected Number of Completed Surveys

Code      Northeast  Midwest  South  West   Total
1                37       11     22     4      74
2                34        9     17     4      63
3                18       11     19     5      53
4                18        7     10     3      38
5                 7        5      3     2      18
6                25       19     35     6      85
7                26       18     17     9      70
8                14       15     23     7      59
9                 6       45     26    13      90
Total           185      139    171    53     548


If the election officials who respond to the survey are a random sample of the total population of election officials, then comparisons between subgroups would have the levels of precision presented in Table 6, below. These are the minimum detectable differences at the 95% confidence level.


Table 6. Minimum Detectable Differences at the 95% Confidence Level


Comparison                               Minimum Detectable Difference
Rural to Metro/NonMetro Combined                     8.50%
Rural to Metro                                       9.80%
Rural to NonMetro                                    9.50%
Metro to NonMetro                                    9.20%
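
The figures in Table 6 can be approximated with a standard two-proportion calculation. The sketch below assumes a worst-case proportion of 0.5, a two-sided 95% critical value of 1.96, and a finite population correction; the exact assumptions and rounding behind Table 6 may differ slightly, so the sketch is a rough check rather than the contractor's calculation.

```python
# Illustrative approximation of the minimum detectable difference between two
# subgroups (worst-case p = 0.5, z = 1.96, finite population correction).
from math import sqrt

Z_95 = 1.96

def minimum_detectable_difference(n1, N1, n2, N2, p=0.5):
    """Smallest difference in proportions detectable between two groups."""
    fpc1 = (N1 - n1) / (N1 - 1)     # finite population corrections
    fpc2 = (N2 - n2) / (N2 - 1)
    standard_error = sqrt(p * (1 - p) * (fpc1 / n1 + fpc2 / n2))
    return Z_95 * standard_error

# Rural (codes 8-9: about 149 completes of 742 jurisdictions) versus
# metro/nonmetro combined (codes 1-7: about 399 completes of 3,874 jurisdictions)
print(f"{minimum_detectable_difference(149, 742, 399, 3874):.3f}")  # roughly 0.085 (8.5%)
```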



The results from this survey will only be representative of the survey respondents and cannot be statistically generalized to all LEOs. Any reports that contain results from this survey will clearly note these limitations. EAC designed this survey to obtain insights into practices and problems in rural and urban districts with the additional goals of identifying possible solutions and informing efforts at providing guidance and technical assistance to districts. EAC expects that the results from this survey will be useful for these purposes.


  2. Describe the procedures for the collection of information including: statistical methodology for stratification and sample selection; estimation procedure; degree of accuracy needed for the purpose described in the justification; unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Electronic Distribution and Follow-Up. A letter from a high-ranking EAC official will be sent to each sample member for whom there is an email address in the database to inform them that they will be receiving a survey invitation in the coming days. The letter will provide background on the purpose and importance of the effort. We anticipate that there will be a number of kickbacks in response to the survey announcement due to bad email addresses, with the most likely cause being a change of personnel. When a regular mailing address is available, these cases will be shifted to the portion of the sample receiving the paper-and-pencil survey. In the absence of an actual name, we will use a generic title (e.g., Elections Administrator, XXX County) in conducting the paper mailing. Both the email and regular mail databases will be updated accordingly.


Approximately three days after the introductory e-mail, a survey invitation will be sent via email to election officials in the online portion of the sample. Instructions for accessing the survey as well as their password will be included, as will information on how to obtain a paper copy of the survey. We will also attach a pdf version of the instrument. This will allow respondents to preview the content of the questionnaire. They will be instructed that, if they prefer, they can print out the file, complete the paper-based survey, and return it to the address provided.


Approximately one week after the initial email invitation, we will send a follow-up to all election officials, thanking those who have already completed the survey and encouraging non-responders to complete it at their earliest convenience. All information included in the first notification will be repeated here. A final email notification will be sent approximately one week prior to the close of the field period to give non-responders one last reminder to complete the survey by the deadline.


Each respondent will be given a unique password to enter the survey. This will allow them to start and stop the survey at their convenience. It also prevents users from completing the survey twice. Finally, it will allow us to identify respondents and non-respondents before initiating the subsequent notifications/mail outs.
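
A minimal sketch of how unique access codes might be issued and used to separate respondents from nonrespondents is shown below. The identifiers and the completed-case list are hypothetical, and the contractor's actual survey platform may handle this differently.

```python
# Illustrative issuance of unique survey access codes and identification of
# nonrespondents for follow-up mailings. Sample IDs and completions are made up.
import secrets

def issue_access_codes(sample_ids):
    """Assign each sample member a hard-to-guess access code (practically unique)."""
    return {member_id: secrets.token_urlsafe(8) for member_id in sample_ids}

sample_ids = ["NE-0001", "MW-0042", "S-0310"]          # hypothetical sample identifiers
access_codes = issue_access_codes(sample_ids)

completed = {"MW-0042"}                                 # IDs the survey platform reports as complete
nonrespondents = [m for m in sample_ids if m not in completed]

print(access_codes)
print("Follow up with:", nonrespondents)
```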


Distribution of Paper Surveys. As mentioned, email addresses will not be available for approximately one-third of LEOs, and some portion of those that are available are likely to be invalid due to personnel turnover. In addition, some portion of the population will have limited Internet access or familiarity, and therefore may prefer receiving a paper copy of the instrument. While respondents who prefer to complete the survey on paper have the option to print a hard copy of the questionnaire using the PDF file attached to the notification emails, or to contact the distribution center to request a hard-copy questionnaire, these options require some initiative and effort on their part.


That portion of the sample for whom email addresses are not available, along with those discovered to have invalid email addresses, will be sent a paper copy of the questionnaire. The survey mailing will include a cover letter on EAC letterhead, a paper version of the questionnaire printed in booklet format, and a postage-paid business reply envelope. Like the email invitation, the letter will be signed by an EAC official and printed on EAC letterhead. We will also include instructions for accessing the survey online along with a password. The cover letter and outside mailing envelope will be personalized for each election official to include his or her unique identification number, full name (when available), and mailing address. The unique identification number will also be printed on the survey booklet included in the mailing. To ensure that the survey mailing has been assembled correctly, envelopes will be randomly pulled for review. The information on the cover letter, outside mailing envelope and the questionnaire booklet will be compared to confirm that all information matches. Once the quality assurance check is completed, the survey packets will be mailed using first class postage.


To maximize the response rate, we will also mail a paper questionnaire to every election official who has not completed the survey via web by approximately three weeks after the initial email invitation.


  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


We have taken, and will take, a number of steps to ensure the highest return rate possible. First, we will use EAC’s database containing contact information for LEOs (email and postal addresses). Turnover among this population is high, so this is an essential component of ensuring adequate coverage. As mentioned, we intend to make first contact through a mailing with a letter signed by the Executive Director of the EAC, which will stress the importance of this research and the vital role election officials play in making sure that the Commission’s efforts to support them in their mission are successful. We will also stress that we understand the pressures of their schedules and have therefore placed a high priority on minimizing the time necessary to provide feedback. We are providing multiple means of responding to account for personal preferences among sample members. We are planning several reminders, with a final follow-up mailing of paper surveys to all nonrespondents. If there is adequate response to the initial request, such reminders will not be carried out.


One method for examining the issue of nonresponse bias is to contact some portion of the sample that did not participate within the survey field period and attempt to obtain their input, allowing for the comparison of their responses with those of sample members who complied with the original request. For several reasons, this is not a viable option in this instance. Each nonrespondent will have been contacted on multiple occasions by the time the field period closes and will have chosen not to take part. There is no reason to assume that an additional request will yield sufficient data to support an adequate nonresponse analysis. The number complying could potentially be increased by using attention-getting methods of contact (e.g., express mailing the survey); however, such methods are beyond the resources of the project. In addition, the risk of alienating members of the target population is real. Given that one of EAC’s roles is to provide LEOs with assistance in administering Federal elections, and that EAC often relies on feedback from these individuals, as well as their participation in workshops and other activities, it would be unwise to pursue the matter of survey participation to a point that could be perceived as harassment.


We will be able to provide some indication of the representativeness of the final survey sample through comparisons of certain demographic characteristics with the population as a whole. Specifically, we will compare survey respondents and nonrespondents in regard to geographic region, size of jurisdiction (number of registered voters), and urban/rural status. This will allow us to provide a picture of how representative the survey respondents are of the population as a whole on these variables, all of which are very germane to the subject of the survey itself.
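
A minimal sketch of one such representativeness check is shown below: a chi-square goodness-of-fit test comparing the respondents' regional distribution with the population distribution in Table 3. The respondent counts used here are hypothetical, and analogous checks could be run for jurisdiction size and urban/rural status.

```python
# Illustrative check of whether respondents' regional distribution departs from the
# population distribution (Table 3). Respondent counts below are hypothetical.
from scipy.stats import chisquare

population_by_region = {"Northeast": 1704, "Midwest": 1064, "South": 1424, "West": 424}
respondents_by_region = {"Northeast": 180, "Midwest": 150, "South": 165, "West": 55}

total_respondents = sum(respondents_by_region.values())
total_population = sum(population_by_region.values())

observed = [respondents_by_region[r] for r in population_by_region]
expected = [total_respondents * population_by_region[r] / total_population
            for r in population_by_region]

statistic, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {statistic:.2f}, p = {p_value:.3f}")  # small p suggests regional nonresponse bias
```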


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the collection of information.

As mentioned previously, EAC staff conducted workshops with LEOs representing both urban and rural jurisdictions, as well as researchers and other experts in this field. Discussions centered on issues pertinent to this topic, and ideas were solicited regarding the dimensions of importance that required further study. Based on this input, a survey was created centering on those issues. The instrument was reviewed by individuals experienced in survey design, and suggested changes were incorporated. The draft instrument was then circulated to six LEOs identified by the EAC, who were asked to complete it and take part in an interview to provide their input on the clarity and comprehensiveness of the questions, as well as suggestions for additional content. The outcomes of this process were reviewed by the project team, and alterations and additions were made where deemed advisable. After the instrument was programmed for online administration, it was tested extensively in house to confirm that the functionality operated properly and that the data were recorded accurately. Once this was completed, the survey was administered to six EAC staff, who were interviewed to determine whether adjustments to the structure of the survey or the instructions were needed to increase accessibility and ease of use.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following individuals were involved in or consulted regarding the statistical aspects of the project:

1. Dr. Karen Lynn-Dyson
Director of Research, Policy, and Programs
Election Assistance Commission
1225 New York Avenue, NW, Suite 1100
Washington, DC 20005
(202) 566-2123
klynndyson@eac.gov

2. Dr. Shelly Anderson
Deputy Director of Research
Election Assistance Commission
1225 New York Avenue, NW, Suite 1100
Washington, DC 20005
(202) 566-3100
sanderson@eac.gov

3. Mr. Matthew Weil
Research Program Specialist
Election Assistance Commission
1225 New York Avenue, NW, Suite 1100
Washington, DC 20005
mweil@eac.gov

4. Dr. Peter Ramsberger
Manager, Center for Personnel Policy Analysis
Human Resources Research Organization
66 Canal Center Plaza, Suite 700
Alexandria, VA 22314
(703) 706-5686
pramsber@humrro.org

5. Dr. Charles Cowan
Chief Executive Officer
Analytic Focus, LLC
4939 De Zavala Road, Suite 105
San Antonio, TX 78249
(210) 641-2817
c.cowan@analyticfocus.com


















Appendix A


Public Law 107-252


Subtitle C--Studies and Other Activities To Promote Effective Administration of Federal Elections


SEC. 241. 42 USC 15381 PERIODIC STUDIES OF ELECTION ADMINISTRATION ISSUES.


(a) In General.--On such periodic basis as the Commission may determine, the Commission shall conduct and make available to the public studies regarding the election administration issues described in subsection (b), with the goal of promoting methods of voting and administering elections which--

(1) will be the most convenient, accessible, and easy to use for voters, including members of the uniformed services and overseas voters, individuals with disabilities, including the blind and visually impaired, and voters with limited proficiency in the English language;

(2) will yield the most accurate, secure, and expeditious system for voting and tabulating election results;

(3) will be nondiscriminatory and afford each registered and eligible voter an equal opportunity to vote and to have that vote counted; and

(4) will be efficient and cost-effective for use.


(b) Election Administration Issues Described.--For purposes of subsection (a), the election administration issues described in this subsection are as follows:

(1) Methods and mechanisms of election technology and voting systems used in voting and counting votes in elections for Federal office, including the over-vote and under-vote notification capabilities of such technology and systems.

(2) Ballot designs for elections for Federal office.

(3) Methods of voter registration, maintaining secure and accurate lists of registered voters (including the establishment of a centralized, interactive, statewide voter registration list linked to relevant agencies and all polling sites), and ensuring that registered voters appear on the voter registration list at the appropriate polling site.

(4) Methods of conducting provisional voting.

(5) Methods of ensuring the accessibility of voting, registration, polling places, and voting equipment to all voters, including individuals with disabilities (including the blind and visually impaired), Native American or Alaska Native citizens, and voters with limited proficiency in the English language.

(6) Nationwide statistics and methods of identifying, deterring, and investigating voting fraud in elections for Federal office.

(7) Identifying, deterring, and investigating methods of voter intimidation.

(8) Methods of recruiting, training, and improving the performance of poll workers.

(9) Methods of educating voters about the process of registering to vote and voting, the operation of voting mechanisms, the location of polling places, and all other aspects of participating in elections.

(10) The feasibility and advisability of conducting elections for Federal office on different days, at different places, and during different hours, including the advisability of establishing a uniform poll closing time and establishing--

(A) a legal public holiday under section 6103 of title 5, United States Code, as the date on which general elections for Federal office are held;

(B) the Tuesday next after the 1st Monday in November, in every even numbered year, as a legal public holiday under such section;

(C) a date other than the Tuesday next after the 1st Monday in November, in every even numbered year as the date on which general elections for Federal office are held; and

(D) any date described in subparagraph (C) as a legal public holiday under such section.

(11) Federal and State laws governing the eligibility of persons to vote.

(12) Ways that the Federal Government can best assist State and local authorities to improve the administration of elections for Federal office and what levels of funding would be necessary to provide such assistance.

(13)

(A) The laws and procedures used by each State that govern--

(i) recounts of ballots cast in elections for Federal office;

(ii) contests of determinations regarding whether votes are counted in such elections; and

(iii) standards that define what will constitute a vote on each type of voting equipment used in the State to conduct elections for Federal office.

(B) The best practices (as identified by the Commission) that are used by States with respect to the recounts and contests described in clause (i).

(C) Whether or not there is a need for more consistency among State recount and contest procedures used with respect to elections for Federal office.

(14) The technical feasibility of providing voting materials in eight or more languages for voters who speak those languages and who have limited English proficiency.

(15) Matters particularly relevant to voting and administering elections in rural and urban areas.

(16) Methods of voter registration for members of the uniformed services and overseas voters, and methods of ensuring that such voters receive timely ballots that will be properly and expeditiously handled and counted.

(17) The best methods for establishing voting system performance benchmarks, expressed as a percentage of residual vote in the Federal contest at the top of the ballot.

(18) Broadcasting practices that may result in the broadcast of false information concerning the location or time of operation of a polling place.

(19) Such other matters as the Commission determines are appropriate.

















Appendix B


Federal Register Notices



























Appendix C


United States Election Assistance Commission

Urban/Rural Study

Survey Cover Letters/Announcements and Survey Instrument



U. S. ELECTION ASSISTANCE COMMISSION

1201 New York Avenue, NW, Suite 300

Washington, DC 20005


[ADVANCE LETTER]

MONTH DD, 2012


Dear Local Election Official,


As you may be aware, HAVA Section 241(b)(15) requires the U.S. Election Assistance Commission (EAC) to study matters particularly relevant to voting and administering elections in rural and urban areas. The purpose of the Survey of Rural and Urban Election Administration is to determine the ways in which election officials conduct voter outreach, secure personnel, and handle any cost-related challenges associated with administering general elections in rural and urban jurisdictions.


The EAC has engaged the services of an independent contractor, Human Resources Research Organization (HumRRO), to work with us to develop and implement this survey. From the survey data, we hope to identify creative and innovative approaches that some jurisdictions have implemented to address the challenges of voter outreach and personnel, and to gain a greater understanding of how urban and rural jurisdictions differ in their approaches to these election administration challenges.


I am writing in advance to encourage you to participate in this survey. In the coming days, you will receive a mailing from HumRRO that includes the survey and a postage-paid business reply envelope.


Your responses are completely anonymous; data will be reported to the EAC on an aggregate basis only. No data will be reported that will allow for the identification of an individual respondent.


Thank you in advance for your participation in this important effort. Your input is invaluable as we seek to provide best practices information to assist with the issues you face in the administration of Federal elections.



Sincerely,





Karen Lynn-Dyson

Director, Research, Policy and Programs Division

U.S. Election Assistance Commission






ADVANCE E-MAIL





MONTH DD, 2012


Dear Local Election Official,


As you may be aware, HAVA Section 241(b)(15) requires the EAC to study matters particularly relevant to voting and administering elections in rural and urban areas. The purpose of the Survey of Rural and Urban Election Administration is to determine the ways in which election officials conduct voter outreach, secure personnel, and handle any cost-related challenges associated with administering general elections in rural and urban jurisdictions.


The EAC has engaged the services of an independent contractor, Human Resources Research Organization (HumRRO), to work with us to develop and implement this survey. From the survey data, we hope to identify creative and innovative approaches that some jurisdictions have implemented to address the challenges of voter outreach and personnel, and to gain a greater understanding of how urban and rural jurisdictions differ in their approaches to these election administration challenges.


I am writing in advance to encourage you to participate in this survey. In the coming days, you will receive an email from HumRRO that includes the survey link and your unique password.


Your responses are completely anonymous; data will be reported to the EAC on an aggregate basis only. No data will be reported that will allow for the identification of an individual respondent.


Thank you in advance for your participation in this important effort. Your input is invaluable as we seek to provide best practices information to assist with the issues you face in the administration of Federal elections.



Sincerely,




Karen Lynn-Dyson

Director, Research, Policy and Programs Division

U.S. Election Assistance Commission



INVITATION LETTER (on HumRRO letterhead)




MONTH DD, 2012


Dear Local Election Official,


The Human Resources Research Organization (HumRRO) is conducting a survey of Local Election Officials on behalf of the United States Election Assistance Commission (EAC). The enclosed letter from the Director of the Research, Policy, and Programs Division of EAC explains the purpose and importance of this effort.


I am writing today to encourage you to participate in this survey. A copy of the survey is enclosed, along with a postage-paid envelope addressed to HumRRO for you to return your completed survey. If you would prefer to complete the survey online, the survey link and your password have been provided below.


If you have any questions, please call HumRRO, toll free, at (800) 301-1508 and ask for the ‘EAC Survey Coordinator’ or send an email to EACstudy@humrro.org.


Please complete and mail the survey by August XX, 2012. Thank you in advance for your participation.



Sincerely,




Gwenyth M. Van Trieste

Survey Manager



To access the survey, go to: https://apps.humrro.org/XXXXXX


Your password is: XXXXXX

(Please note that the password is case sensitive, so you must enter numbers and upper and lower case letters exactly as shown.)



[ENCLOSE ADVANCE LETTER ON EAC LETTERHEAD]

[ENCLOSE SURVEY WITH ID]

[ENCLOSE BRE]


INVITATION EMAIL


MONTH DD, 2012


Dear Local Election Official,


The Human Resources Research Organization (HumRRO) is conducting a survey of Local Election Officials on behalf of the United States Election Assistance Commission (EAC). The attached letter from the Director of the Research, Policy, and Programs Division of EAC explains the purpose and importance of this effort. Please begin by reading this letter.


This anonymous survey should take no more than 30 minutes to complete. You will be able to leave the survey and return to where you left off, if necessary. To access the survey, type or copy and paste the following URL into the address bar of your browser:


https://apps.humrro.org/XXXXXX


You have been assigned an individual password to gain access to the survey. Please enter it in the space provided. Please note that the password is case sensitive, so you must enter numbers and upper and lower case letters exactly as shown below:


Your password is: XXXXXX


If you prefer to complete the survey on paper, a copy of the survey is attached to this email. Please feel free to print the survey and return it to HumRRO by mail or fax.


Mail to: HumRRO

PO Box 6640

Lawrenceville, NJ 08648


Fax to: 609-512-3730


If you have any questions, please send an email to EACstudy@humrro.org or call HumRRO, toll free, at (800) 301-1508 and ask for the ‘EAC Survey Coordinator.’


If possible, please complete the survey by August XX, 2012. Thank you in advance for your participation.


Sincerely,


Gwenyth M. Van Trieste

Survey Manager


[ATTACH ADVANCE LETTER .pdf]

[ATTACH SURVEY .pdf]







United States Election Assistance Commission

Urban/Rural Study


Local Election Officials Survey





ABOUT THIS SURVEY

The United States Election Assistance Commission (EAC) was created as part of the Help America Vote Act (HAVA) to assist State and local election officials with the administration of Federal elections. HAVA Section 241(b)(15) requires EAC to study “[m]atters particularly relevant to voting and administering elections in rural and urban areas.” The purpose of the Survey of Rural and Urban Election Administration is to determine the ways in which election officials conduct voter outreach, secure personnel, and handle any cost-related challenges associated with administering general elections in rural and urban jurisdictions. You will be asked questions about your jurisdiction; your answers are for research purposes only and are not connected to any enforcement activity on the part of other Federal agencies.



In 2009 and 2010, EAC conducted two working groups with election officials from rural and urban communities and with social science researchers. The purpose of the working groups was to gain perspective and feedback on how EAC might approach this study. The working group members spent their time considering current challenges related to administering elections in urban and rural areas. Potential areas for future study they identified included voter outreach and personnel (along with costs related to these factors). Voter outreach and personnel are examples of areas in which jurisdictions are demonstrating creativity and innovation in responding to election administration challenges and, therefore, may present an interesting contrast when considered in the context of urban and rural election administration. These are also areas where cost savings can be realized. Highlighting these topics in EAC’s report will provide a greater understanding of how urban and rural areas differ on these issues and might help to provide best practices information for election officials around the country.




















Your input in this study is very important. This survey should take 20 minutes to complete. Please respond to all applicable questions. If you would be willing to participate in an in-depth interview on the topics covered in this survey, please indicate this at the end of the survey.





Background



1. How long have you served as an election official? (include total experience in all jurisdictions)


_______ number of years



2. Approximately how many registered voters reside in the jurisdiction you currently serve?


_______ approximate number of registered voters



3. How would you describe your jurisdiction? Is it primarily rural or primarily urban?


Rural

Urban

Both. My jurisdiction includes both rural and urban areas.



4. Is your jurisdiction required to provide language assistance under Section 203 of the Voting Rights Act?


  • Yes – Go to question 4a

  • No – Skip to question 5


4(a). If yes, for which languages or language groups is your jurisdiction required to provide assistance? (Check all that apply)

○ Spanish

○ Asian languages

○ Alaskan / Native American languages

○ Other (please specify) ____________________



5. Does your office have full/ultimate responsibility for all aspects of elections in your jurisdiction (e.g., voter registration, voting machines, ballots, vote counting, etc.)? Please note that your office may have full/ultimate responsibility for an election-related activity even if it is not actually performed in your office (e.g., computer-related support).


  • Yes – Skip to question 6

  • No – Go to question 5a


5(a). (If no) Is full/ultimate responsibility for all aspects of elections in your jurisdiction…

○ A state function only

○ A shared state and local function

○ Other (please specify)________________________


6. Please indicate whether or not each of the following is allowed in your jurisdiction.



Yes

No

Absentee voting (excuse required)

No-excuse absentee voting

Early voting

All vote-by-mail



Voter Outreach


The next series of questions is about voter outreach activities. For purposes of this survey, please consider voter outreach to be any activity that your office engages in to provide information to the voting public. This includes information your office is required to provide and responses to information requests from individuals and/or organizations.


7. For each of the following, please indicate whether your office provides this type of outreach to the voting public. If your office provides this outreach, please indicate the language(s) in which it is provided.


Type of Outreach

Does your office provide this type of outreach?

If YES, in what languages is the outreach provided?

Yes


No


English only

English and other languages

Other languages only

Paid print advertising (e.g., newspaper)

Paid television/radio advertising

Elections Office/County website

Hard copy direct mailing to voters (e.g., voter’s guide, sample ballot)

Toll-free telephone line

Social media (e.g., Facebook, Twitter, blogs)

Participating in community events

Other (please specify)

______________________________________




8. Does your jurisdiction form partnerships with any third-party or civic organizations on voter outreach efforts?


Yes – Continue to question 8a

No – Skip to question 9



8a. For each of the following, please indicate whether your jurisdiction forms partnerships with other organizations on this type of voter outreach effort.

Types of Outreach

Conduct with other organizations

Do not conduct with other organizations

Paid print advertising (e.g., newspaper)

Paid television/radio advertising

Elections Office/County website

Hard copy direct mailing to voters (e.g., voter’s guide, sample ballot)

Toll-free telephone line

Social media (e.g., Facebook, Twitter, Blogs)

Participating in community events

Other (please specify)

___________________________________________________


Other (please specify)

___________________________________________________




8b. Please indicate whether your jurisdiction forms partnerships with each of the following types of organizations on voter outreach efforts.


Types of Organization

Conduct with this type of organization

Do not conduct with this type of organization

School-related organization(s)

Non-profit organization(s)

Political parties

Other type of organization (please specify)

___________________________________________


Other type of organization (please specify)

___________________________________________


8c. We are particularly interested in voter outreach efforts that jurisdictions have provided in partnership with other organizations. Please provide further information about these efforts.








9. For each of the following, please indicate whether your jurisdiction has voter outreach initiatives or activities that focus on this group.


Voter outreach focus

Focus on this group

Do not focus on this group

Students

Racial/ethnic minorities

Foreign language speakers

Voters in long-term care facilities

Voters with disabilities

Other group (please specify)

____________________________________________


Other group (please specify)

____________________________________________




10. Approximately how much did voter outreach efforts for the 2010 Mid-Term Election cost your jurisdiction?


○ $10,000 or less
○ $10,001 – 20,000
○ $20,001 – 30,000
○ $30,001 – 40,000
○ $40,001 – 50,000
○ $50,001 – 60,000
○ $60,001 – 70,000
○ $70,001 – 80,000
○ $80,001 – 90,000
○ $90,001 – 100,000
○ $100,001 – 200,000
○ $200,001 or more


11. Approximately how much do you anticipate voter outreach efforts for the 2012 General Election will cost your jurisdiction?


○ $10,000 or less
○ $10,001 – 20,000
○ $20,001 – 30,000
○ $30,001 – 40,000
○ $40,001 – 50,000
○ $50,001 – 60,000
○ $60,001 – 70,000
○ $70,001 – 80,000
○ $80,001 – 90,000
○ $90,001 – 100,000
○ $100,001 – 200,000
○ $200,001 – 300,000
○ $300,001 – 400,000
○ $400,001 – 500,000
○ $500,001 or more

12. How are your jurisdiction’s voter outreach efforts paid for? (Check all that apply)


From the local election office budget

From fees collected for other office responsibilities (e.g., rental of equipment, selling voter lists, etc.)

From line item appropriation in the county or state budget

Other (please specify) __________________________________________________



13. In general, how easy or difficult is it for your jurisdiction to engage in voter outreach for general election cycles?


Very easy

Somewhat easy

Neither easy nor difficult

Somewhat difficult

Very difficult


14. How much of a problem is each of the following in engaging in voter outreach for general election cycles?



A big problem

A moderate problem

A small problem

Not a problem at all

Cost

Staff availability/time

Availability of media outlets

Travel distance required for in-person contact

Limitations on Internet access or reliability

Variety of languages spoken

Other (please specify)

________________________


Other (please specify)

________________________





Personnel


15. Please indicate how many of each of the following types of paid staff you had in 2010.


a. In 2010, approximately how many paid full-time (permanent) staff did you have?


_______ number of paid full-time (permanent) staff



b. In 2010, approximately how many paid part-time (permanent) staff did you have?


_______ number of paid part-time (permanent) staff


c. In 2010, approximately how many paid temporary staff did you have (e.g., workers who come in around election time to help with administrative tasks such as data entry for voter registration, work the customer service hotline, etc.)? Please do NOT include poll workers.


_______ number of paid temporary staff


d. In 2010 did you “borrow” staff from other departments within your local/municipal government to supplement your full-time, part-time, and temporary staff?


Yes (please indicate approximate number of staff) ___________

No


16. For the 2010 General Election, approximately how many poll workers/election judges did your office use?


_______ number of poll workers/election judges


17. Please indicate how many of each of the following types of paid staff you anticipate having in 2012.


a. In 2012, approximately how many paid full-time (permanent) staff do you anticipate having?


_______ number of paid full-time (permanent) staff


b. In 2012, approximately how many paid part-time (permanent) staff do you anticipate having?


_______ number of paid part-time (permanent) staff


c. In 2012, approximately how many paid temporary staff do you anticipate having (e.g., workers who come in around election time to help with administrative tasks such as data entry for voter registration, work the customer service hotline, etc.)? Please do NOT include poll workers.


_______ number of paid temporary staff


d. In 2012 do you anticipate “borrowing” staff from other departments within your local/municipal government to supplement your full-time, part-time, and temporary staff?


Yes (please indicate approximate number of staff) ___________

No


18. For the 2012 General Election, approximately how many poll workers/election judges do you anticipate your jurisdiction will use?


_______ number of poll workers/election judges


NOTE: For Questions 19 and 20, “poll workers” does not include Chief, Assistant Chief, Judges of Elections, Captains, or Supervisors; only poll workers.


19. Are your poll workers paid for their work on Election Day?


Yes – Continue to question 19a

No – Skip to question 20



19a. How much are your poll workers paid for their work on Election Day?

One-time set stipend of $________

Hourly rate in the amount of $___________per hour


20. Are your poll workers paid for training?


Yes – Continue to question 20a

No – Skip to question 21



20a. How much are your poll workers paid for training?

One-time set stipend of $________

Hourly rate in the amount of $___________per hour

Payment in question 19a includes training pay



21. Please indicate which recruiting sources you use to obtain poll workers for General Elections. For each source used, please indicate how successful the source has been for you in obtaining poll workers.


Recruiting Source

Do you use this source?

If you use this source, how successful has it been?

YES

NO

Very Successful

Successful

Somewhat Successful

Not Successful

Classified ads

Recruiting at college campuses

Recruiting at high schools

Recruiting through website

Recruiting through local businesses

Recruiting through volunteer organizations

Recruiting through other government agencies/departments

Recruiting through word of mouth (e.g., current poll workers encourage friends/coworkers to volunteer)

Responding to requests from individuals or groups regarding becoming poll workers

Other (please specify)

________________________


Other (please specify)

________________________





22. In general, how easy or difficult is it for your jurisdiction to obtain a sufficient number of poll workers for general election cycles?


Very easy

Somewhat easy

Neither easy nor difficult

Somewhat difficult

Very difficult


23. For each of the following, please indicate how much of a problem it presents in obtaining a sufficient number of poll workers for general election cycles.



A big problem

A moderate problem

A small problem

Not a problem at all

Payment is too low

Election Day work hours are too long

Little respect for poll workers

Training is too long/takes too much time

Potential poll workers cannot get off from work to serve

Requirement for equal numbers of poll workers from different political parties

Lack of skilled or qualified workers

Other (please specify)

__________________________________


Other (please specify)

__________________________________




24. Does your jurisdiction offer split shifts for poll workers on Election Day? That is, can poll workers sign up to work less than a full day at the polls on Election Day?

Yes – Go to question 24a

No – Go to question 24b


24a. (If split shifts are offered) What impact does the ability to offer split shifts have on your recruitment of poll workers?


Makes it much easier to recruit poll workers.

Makes it somewhat easier to recruit poll workers.

Has no impact.


24b. (If split shifts are not offered) What impact would the ability to offer split shifts have on your recruitment of poll workers?


Would make it much easier to recruit poll workers.

Would make it somewhat easier to recruit poll workers.

Would have no impact.



25. Please provide any additional comments you may have about administering elections in urban and rural jurisdictions. In particular, we are interested in any ideas and/or experience you have regarding voter outreach and personnel that you feel had a positive impact on your ability to administer general elections.















Thank you for participating in this survey.


EAC is planning to conduct in-depth follow-up interviews regarding the topics addressed in this survey. If you would be willing to take part in an in-person interview concerning the same topic, check this box and provide your contact information below. Please note that your contact information will be separated from the answers you have provided in the survey and will be used only to contact you for a follow-up interview.


Name: _______________________________________________________________


Phone: ____________________________


E-mail: ____________________________


Instructions


After you have completed the survey, please place the questionnaire in the postage-paid envelope provided and return it in the mail to:


HumRRO

P.O. Box 6640

Lawrenceville, NJ 08648


If you prefer, you may fax the completed survey to HumRRO at 609-512-3730.



1 In 2011 EAC developed a Statement of Work and issued a competitive RFQ for the development, execution, and analysis of a survey to be administered to local election officials regarding the administration of elections in rural and urban areas. In late 2011 a contract was awarded to HumRRO to conduct the survey and perform an analysis of the findings in advance of the 2012 General Election. EAC’s expectation has been that HumRRO would deliver the survey findings and analysis by the end of calendar year 2012.

2 If future EAC funding were to allow, the second phase might involve EAC staff conducting in-depth field interviews with a select number of local election officials located in certain urban and rural locales.



