Report Date: June 29, 2012
SURVEY REPORT
Note No. 2012-010
2010 Post-Election Voting Survey of Local Election Officials: Nonresponse
Bias Study
Introduction
In accordance with the Office of Management and Budget’s recommendation, DMDC performed a
nonresponse bias (NRB) study of the 2010 Post-Election Voting Survey of Local Election Officials.
NRB refers to the possibility that respondents to a survey differ systematically from non-respondents, causing the survey estimates not to be representative of the entire population. The goal of this study was to determine to what extent NRB existed in these survey estimates. To gain participation from non-respondents to the production survey, DMDC created an abbreviated form of the survey and called jurisdictions to collect the data by telephone. Survey interviewers directed respondents to a website where they could read the survey questions for assistance.
Based on the results of the study, NRB does appear to affect at least a portion of the questions on the 2010 Post-Election Voting Survey of Local Election Officials. Differences in the means of unweighted data from the production and NRB surveys suggest that estimates depend on which jurisdictions actually respond. However, the variability within the weighting process could be significantly reduced given a variable that is better correlated with Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) voters, the focus of the survey.
Design
Production Survey
The production 2010 Post-Election Voting Survey of Local Election Officials was a census of all 7,296
jurisdictions in the 50 states and four U.S. territories. DMDC developed the sampling frame from
three sources: 1) a file provided by the Federal Voting Assistance Program (FVAP), 2) state election website research, and 3) website research from the Overseas Vote Foundation (OVF). For weighting purposes, the jurisdictions were stratified based on the number of registered voters. To encourage participation from the largest jurisdictions, FVAP attempted to call the 1,000 jurisdictions with the most registered voters, based on administrative data, and directed them to the website. All jurisdictions received postal notifications and a paper survey, as well as email notifications to complete the web survey; 450 jurisdictions had no email address on file, 1,891 had invalid email addresses, and the remaining 4,955 had at least one valid email address. The production survey was fielded from November 30, 2010 to February 16, 2011. Of the 7,296 jurisdictions in the 50 states and four U.S. territories, 3,894 responded to the production survey, leaving 3,402 non-responding jurisdictions.
Nonresponse Bias Study
Of the non-respondents to the production survey, those that returned a blank survey, were postal non-deliverable, or did not return a survey were considered eligible for the nonresponse study. DMDC targeted 500 respondents for the nonresponse bias study to assess the possible existence of NRB in the 2010 Post-Election Voting Survey of Local Election Officials. Based on an assumed response rate of approximately 50 percent to the nonresponse bias study, DMDC sampled 1,000 eligible non-respondents. The sample size was dictated by budget constraints rather than variance considerations.
An optimal allocation was used, which determines the best sample allocation based on population size
and variance within each stratum, as defined in the production survey. Due to the small number of large jurisdictions and the large variance within those strata, any non-respondent jurisdiction with more than 40,000 registered voters was included in the sample with certainty. The remaining non-responding jurisdictions were selected with sampling fractions determined by the optimal allocation; these sampling fractions can be seen in Table 4. Each jurisdiction in the sample was then called and directed to the website to complete an abbreviated version of the survey. The nonresponse bias study was fielded from June 1 through June 22, 2011. This delayed fielding period may have affected survey measurement, as discussed in the Nonresponse Bias Study Results section of this paper.
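The allocation described above can be sketched in code. The stratum sizes and standard deviations below are illustrative placeholders, not the survey's actual inputs; the sketch shows the mechanic of taking the certainty strata first and then applying Neyman (optimal) allocation to the remainder.

```python
# Hedged sketch of the NRB sample allocation: jurisdictions above the
# 40,000-registered-voter cutoff are sampled with certainty, and the
# remaining sample is allocated across strata in proportion to N_h * S_h
# (Neyman allocation). Stratum values are illustrative, not the survey's.

def allocate(strata, total_n, certainty_cutoff=40_000):
    """strata: list of dicts with stratum lower bound, population N, std dev S."""
    certain = [s for s in strata if s["lower"] > certainty_cutoff]
    rest = [s for s in strata if s["lower"] <= certainty_cutoff]
    n_left = total_n - sum(s["N"] for s in certain)      # take-all strata come off the top
    weight = sum(s["N"] * s["S"] for s in rest)
    alloc = {s["lower"]: s["N"] for s in certain}        # sampled with certainty
    for s in rest:
        alloc[s["lower"]] = round(n_left * s["N"] * s["S"] / weight)
    return alloc

strata = [
    {"lower": 0,      "N": 4200, "S": 50},    # many small, low-variance jurisdictions
    {"lower": 5_001,  "N": 829,  "S": 120},
    {"lower": 40_001, "N": 150,  "S": 900},   # few large, high-variance: certainty stratum
]
print(allocate(strata, total_n=1000))  # {40001: 150, 0: 577, 5001: 273}
```

Note that the high-variance certainty stratum absorbs its full population before the proportional split, mirroring the report's treatment of jurisdictions with more than 40,000 registered voters.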
Table 1.
Sample Size and Percent by Disposition Code for the Two Surveys

| Disposition Code | Production Frequency | Production Percent | Nonresponse Bias Frequency | Nonresponse Bias Percent |
| 1 Record ineligible based on sample file (a) | 0 | 0% | 0 | 0% |
| 2 Ineligible--Self or proxy report (b) | 0 | 0% | 0 | 0% |
| 3 Ineligible--Survey self report (c) | 0 | 0% | 0 | 0% |
| 4 Complete eligible response | 3,894 | 53% | 249 | 25% |
| 5 Incomplete eligible response (d) | 0 | 0% | 0 | 0% |
| 8 Refused/other (e) | 614 | 8% | 98 | 10% |
| 9 Blank | 208 | 3% | 0 | 0% |
| 10 Postal non-deliverable (PND) (f) | 85 | 1% | 0 | 0% |
| 11 Non-respondents | 2,495 | 34% | 653 | 65% |
| Total | 7,296 | 100% | 1,000 | 100% |

a The population file for jurisdictions was created months in advance. Had jurisdictions been redrawn in the time before fielding, a jurisdiction on the file could have become record ineligible. This disposition code tends to be more prevalent on personnel surveys, when members leave the service or are promoted beyond the scope of the survey between drawing the sample and fielding the survey.
b If a jurisdiction had contacted the data collection agency and claimed to be ineligible for the survey, it would receive disposition code 2, ineligible by means of self or proxy report.
c Jurisdictions would become survey self-report ineligible if their answers to the survey questions indicated they should not be included. This disposition code is more commonly used in surveys of military members who, for instance, may have left the military by the time the survey fields.
d Due to the imputation scheme applied, any jurisdiction that responded to at least one item was considered complete.
e Refusals to complete the survey are not treated as nonresponses. Therefore, those with a disposition code of 8 are not eligible for the NRB study.
f In the production survey, PNDs are based on the mailing address for the jurisdiction. In the NRB study, PNDs are based on telephone numbers.
Response rates to the production survey were higher than for the nonresponse survey, as Table 1 shows
that jurisdictions were more likely to become eligible respondents in the production survey (53%) than
in the nonresponse bias survey (25%). The impact of telephone calls on encouraging participation in
the NRB study from those jurisdictions that did not complete the production survey is limited due to
the phone calls already made to the largest 1,000 jurisdictions during the production survey. 1 Table 5
shows the breakdown of response rates by jurisdiction size for each of the surveys.
1
All jurisdictions with more than 29,202 registered voters received phone calls encouraging them to complete the
production survey. Therefore, an increase in response propensity attributed to the telephone contact method, as opposed to
an increase in number of contacts, should have been captured in the production survey. This is reflected in the higher
response rates for larger jurisdictions in the production survey and similar response rates for all strata in the NRB study, as
shown in Table 5.
Table 3 shows the un-weighted response rates for each of the surveys, calculated in accordance with
the American Association for Public Opinion Research (AAPOR) RR3 recommendations. 2
Location, completion, and response rates were computed as follows:
The location rate (LR) is defined as

    LR = (adjusted located sample, N_L) / (adjusted eligible sample, N_E).

The completion rate (CR) is defined as

    CR = (usable responses, N_R) / (adjusted located sample, N_L).

The response rate (RR) is defined as

    RR = (usable responses, N_R) / (adjusted eligible sample, N_E).

where

• N_L = adjusted located sample.
• N_E = adjusted eligible sample.
• N_R = usable responses.
To identify the cases that contribute to the components of LR, CR, and RR, the disposition
codes were grouped as shown in Table 2. Record ineligibles were excluded from calculation of the
eligibility rate.
Table 2.
Disposition Codes for AAPOR Response Rates

| Response Category | Survey Disposition Code |
| Eligible Sample | 4, 5, 8, 9, 10, 11 |
| Located Sample (a) | 4, 5, 8, 9, 11 |
| Eligible Response | 4 |
| No Return | 11 |
| Eligibility Determined | 2, 3, 4, 5, 8, 9 |
| Self-Reported Ineligible | 2, 3 |

a The criterion for a complete respondent was response to any survey item. Therefore, there were no incomplete respondents (disposition code 5).
2
The American Association for Public Opinion Research. 2011. Standard Definitions: Final Dispositions of Case Codes
and Outcome Rates for Surveys. 7th edition. AAPOR.
Ineligibility Rate
The ineligibility rate (IR) is defined as

    IR = (Self-Reported Ineligible) / (Eligibility Determined).

Estimated Ineligible Postal Non-Deliverable/Not Located Rate
The estimated ineligible postal non-deliverable/not located rate (IPNDR) is defined as

    IPNDR = (Eligible Sample − Located Sample) × IR.

Estimated Ineligible Nonresponse
The estimated ineligible nonresponse (EINR) is defined as

    EINR = (No Return) × IR.

Adjusted Location Rate
The adjusted location rate (ALR) is defined as

    ALR = (Located Sample − EINR) / (Eligible Sample − IPNDR − EINR).

Adjusted Completion Rate
The adjusted completion rate (ACR) is defined as

    ACR = (Eligible Response) / (Located Sample − EINR).

Adjusted Response Rate
The adjusted response rate (ARR) is defined as

    ARR = (Eligible Response) / (Eligible Sample − IPNDR − EINR).
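These formulas can be checked against the production-survey disposition counts in Table 1. A short Python sketch (ours, not DMDC's) reproduces the production column of Table 3; because there were no self-reported ineligibles, IR and the two estimated-ineligible adjustments are zero and the rates reduce to simple ratios.

```python
# Reproduce the production-survey rates in Table 3 from the Table 1
# disposition counts, following the AAPOR-style formulas above.
counts = {2: 0, 3: 0, 4: 3894, 5: 0, 8: 614, 9: 208, 10: 85, 11: 2495}

eligible_sample = sum(counts[c] for c in (4, 5, 8, 9, 10, 11))   # 7,296
located_sample = sum(counts[c] for c in (4, 5, 8, 9, 11))        # excludes PNDs
eligible_response = counts[4]
no_return = counts[11]
eligibility_determined = sum(counts[c] for c in (2, 3, 4, 5, 8, 9))
self_reported_ineligible = counts[2] + counts[3]

ir = self_reported_ineligible / eligibility_determined           # 0 for this survey
ipndr = (eligible_sample - located_sample) * ir
einr = no_return * ir

alr = (located_sample - einr) / (eligible_sample - ipndr - einr)
acr = eligible_response / (located_sample - einr)
arr = eligible_response / (eligible_sample - ipndr - einr)
print(round(alr * 100), round(acr * 100), round(arr * 100))      # 99 54 53
```

The printed values match the production-survey column of Table 3 (99%, 54%, 53%).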
Table 3.
Location, Completion, and Response Rates for the Two Surveys
| Rate | Production | Nonresponse Bias |
| Adjusted Location Rate (a) | 99% | 100% |
| Adjusted Completion Rate | 54% | 25% |
| Adjusted Response Rate | 53% | 25% |

a The location rate for the production survey is based on mailing addresses; for the NRB study it is based on working telephone numbers.
Sample Composition
To compare the two surveys, especially in terms of unweighted response data, the composition of the respondents should be examined. The breakdown by jurisdiction size for each of the surveys is shown in Table 4.
Table 4.
Distribution of Sample and Respondents by Jurisdiction Size for the Two Surveys
| Number of Registered Voters | Population | Pct. | Production Respondents | Pct. | NRB Sample | Pct. | Sampling Fraction | NRB Respondents | Pct. |
| <=5,000 | 4,200 | 57.6% | 2,167 | 55.6% | 335 | 33.5% | 0.20 | 87 | 34.9% |
| 5,001-10,000 | 829 | 11.4% | 436 | 11.2% | 71 | 7.1% | 0.23 | 14 | 5.6% |
| 10,001-29,202 (a) | 1,267 | 17.4% | 663 | 17.0% | 285 | 28.5% | 0.57 | 74 | 29.7% |
| 29,203-40,000 | 237 | 3.2% | 136 | 3.5% | 73 | 7.3% | 0.85 | 14 | 5.6% |
| 40,001-75,000 | 319 | 4.4% | 197 | 5.1% | 100 | 10.0% | 1.00 | 25 | 10.0% |
| 75,001-100,000 | 102 | 1.4% | 54 | 1.4% | 44 | 4.4% | 1.00 | 16 | 6.4% |
| 100,001-200,000 | 162 | 2.2% | 112 | 2.9% | 45 | 4.5% | 1.00 | 8 | 3.2% |
| 200,001-360,000 | 84 | 1.2% | 66 | 1.7% | 16 | 1.6% | 1.00 | 6 | 2.4% |
| 360,001+ | 96 | 1.3% | 63 | 1.6% | 31 | 3.1% | 1.00 | 5 | 2.0% |
| Total | 7,296 | 100.0% | 3,894 | 100.0% | 1,000 | 100.0% |  | 249 | 100.0% |
a
To encourage response from large jurisdictions in the production survey, the largest 1,000 jurisdictions, which included all jurisdictions with more than
29,202 registered voters, were called. To capture the effect of these calls on response propensity, the strata were created so that none of the largest 1,000
jurisdictions is in a stratum with a jurisdiction that did not receive a call. The NRB strata were defined in the same way to allow comparison.
In each of the two surveys, the breakdown by jurisdiction size does not vary greatly between the sample and the respondents. 3 This is because the much larger number of small jurisdictions masks the increased response propensity of large jurisdictions in the production survey. Note that in the 2008 survey, response rates for large jurisdictions were lower than for smaller jurisdictions, a trend that was reversed in 2010 by the telephone calls to large jurisdictions. 4 The response rates for the 2010 surveys, shown in Table 5, indicate that the phone calls to the largest jurisdictions in the production survey were effective at gaining participation from those jurisdictions. Also, because the respondent compositions differ between the two surveys (56% of production-survey respondents have fewer than 5,000 registered voters, but only 35% of NRB respondents are in that stratum), unweighted estimates may differ even if no NRB is present, because the NRB sample design is weighted toward large jurisdictions. Therefore, testing for NRB by comparing estimates from the two surveys requires weighting the data.
Subgroup Response Rates
Table 5 shows the response rates by jurisdiction size for both surveys. Larger jurisdictions tend to
have higher response rates in the production survey, indicating that the calls to the 1,000 largest
jurisdictions did induce participation.
3
In 2008, DMDC observed that large jurisdictions had lower response rates than smaller jurisdictions. In 2010, it appears that the addition of phone calls raised response for large jurisdictions to roughly the level of other jurisdictions.
4
“Table 11. Rates for Full Sample and Stratification Levels.” DMDC. (2009). 2008 Post-Election Survey of Local Election
Officials: Statistical Methodology Report. (Report No. 2009-053). Arlington, VA.
Table 5.
Response Rates by Jurisdiction Size
| Number of Registered Voters | Production Population | Production Respondents | Pct. | NRB Sample | NRB Respondents | Pct. |
| <=5,000 | 4,200 | 2,167 | 52% | 335 | 87 | 26% |
| 5,001-10,000 | 829 | 436 | 53% | 71 | 14 | 20% |
| 10,001-29,202 (a) | 1,267 | 663 | 52% | 285 | 74 | 26% |
| 29,203-40,000 | 237 | 136 | 57% | 73 | 14 | 19% |
| 40,001-75,000 | 319 | 197 | 62% | 100 | 25 | 25% |
| 75,001-100,000 | 102 | 54 | 53% | 44 | 16 | 36% |
| 100,001-200,000 | 162 | 112 | 69% | 45 | 8 | 18% |
| 200,001-360,000 | 84 | 66 | 79% | 16 | 6 | 38% |
| 360,001+ | 96 | 63 | 66% | 31 (b) | 5 | 16% |
| Total | 7,296 | 3,894 | 53% | 1,000 | 249 | 25% |
a
To encourage response from large jurisdictions in the production survey, the largest 1,000 jurisdictions, which included all jurisdictions with more than
29,202 registered voters, were called. To capture the effect of these calls on response propensity, the strata were created so that none of the largest 1,000
jurisdictions is in a stratum with a jurisdiction that did not receive a call. The NRB strata were defined in the same way to allow comparison.
b
Due to the telephone calls, LEOs in large jurisdictions were more likely to refuse the survey. Although non-responding large jurisdictions were sampled
with certainty for the NRB study, those that refused the production survey were excluded.
Nonresponse Bias Study Results
To determine the effectiveness of the weighting process in reducing NRB, it is useful to first compare unweighted data. Based on the unweighted data for the 2010 Post-Election Voting Survey of Local Election Officials, the means for some of the questions differ considerably between the two surveys and, as a result, may contain NRB. Questions that cover all voters (rather than subgroups such as Uniformed Service members) and have means greater than 1 in at least one of the surveys are shown in Table 6. See Appendices A and B for the production survey questionnaire and the NRB study questionnaire, respectively.
Table 6.
Un-weighted Data by Question
| Question | Production Survey Mean | Nonresponse Bias Study Mean (a) | Percent Difference (b) |
| Total registered UOCAVA voters (c) | 235 | 112 | 71% |
| Total participating UOCAVA voters (c) | 96 | 125 | 26% |
| Regular UOCAVA absentee ballots returned (c) | 33 | 34 | 2% |
| Regular UOCAVA absentee ballots rejected (c) | 2 | 1 | 44% |
| Regular UOCAVA absentee ballots counted (c) | 31 | 31 | 2% |
| Federal Write-In Absentee Ballots returned by UOCAVA voters (c) | 1 | 2 | 62% |
| Federal Write-In Absentee Ballots counted for UOCAVA voters (c) | 1 | 3 | 123% |

a While the production survey data have been imputed, the NRB study data have not.
b Percent difference is defined as the absolute value of 100 × (Production Survey Mean − NRB Mean) / ((Production Survey Mean + NRB Mean)/2).
c UOCAVA voters are voters covered by the Uniformed and Overseas Citizens Absentee Voting Act.
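The percent-difference definition in note b of Table 6 can be checked directly against the table's first two rows:

```python
# Percent difference as defined in Table 6, note b: the absolute
# difference of the two means relative to their average.
def percent_difference(prod_mean, nrb_mean):
    return abs(100 * (prod_mean - nrb_mean) / ((prod_mean + nrb_mean) / 2))

# First two rows of Table 6 (unweighted means).
print(round(percent_difference(235, 112)))  # 71
print(round(percent_difference(96, 125)))   # 26
```

Dividing by the average of the two means, rather than by either mean alone, keeps the measure symmetric in the two surveys.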
Due to the differing number of respondents between the two surveys, unweighted totals for these questions are not comparable. Therefore, Table 6 displays the unweighted mean value of each question for survey respondents. Based on these means, NRB does seem to exist in at least the total number of registered and participating UOCAVA voters. However, the presence of NRB can at least partially be explained by the differing distributions of eligible respondents to the two surveys: 23% of the responding jurisdictions in the NRB survey had over 40,000 registered voters based on administrative data, whereas only 13% of the respondents to the production survey were that size.
In order to account for variable response rates across jurisdiction size, DMDC weighted the NRB data.
The data from the production survey was weighted to represent the entire population of jurisdictions.
To see if this data exhibits NRB, it was compared with composite estimates that incorporate responses
from both surveys. The composite estimates were the sum of the unweighted responses from the
production survey, which represent the population of respondents because this survey was a census,
and the responses from the NRB study weighted up to the population of non-respondents, which are all
jurisdictions that did not respond to the production survey:
Composite Estimate = Production Survey Un-weighted Total + Nonresponse Bias Weighted Estimate
This weighting was done using the same poststrata shown in Table 4. An example of the creation of a
composite estimate is shown in Table 7, while the comparisons to the production survey weighted data
are shown in Table 8.
Table 7.
Calculation Example for Composite Estimates
| Question | Production Survey Unweighted Total | Nonresponse Bias Weighted Estimate (a) | Nonresponse Bias Unweighted Estimate | Composite Estimate (b) |
| Total participating UOCAVA voters | 375,243 | 258,606 | 20,790 | 633,849 |

a The Nonresponse Bias Weighted Estimate represents the population of non-respondents to the production survey. Because NRB data were not imputed, each question was weighted separately to the full population of non-respondents.
b Composite Estimate = Production Survey Unweighted Total + Nonresponse Bias Weighted Estimate.
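The composite estimate is the simple sum defined above: the production survey's unweighted total already covers the population of respondents (the survey was a census), and the NRB weighted estimate stands in for the non-respondents. A one-line check against Table 7:

```python
# Composite estimate = production unweighted total (a census of the
# respondents) + NRB weighted estimate (representing non-respondents).
production_unweighted_total = 375_243   # Table 7, total participating UOCAVA voters
nrb_weighted_estimate = 258_606

composite = production_unweighted_total + nrb_weighted_estimate
print(composite)  # 633849, matching the Composite Estimate in Table 7
```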
Table 8.
Comparison of Production and Composite Estimates by Question
| Question | Production Survey Estimate | Production Survey Margin of Error | Composite Estimate |
| Total registered UOCAVA voters (a) | 1,468,641 | 341,544 | 1,089,420 (b) |
| Total participating UOCAVA voters (a) | 597,490 | 237,067 | 633,849 |
| Regular UOCAVA absentee ballots returned (a) | 193,661 | 19,059 | 180,337 |
| Regular UOCAVA absentee ballots rejected (a) | 10,176 | 2,506 | 8,238 (b) |
| Regular UOCAVA absentee ballots counted (a) | 184,242 | 17,012 | 181,779 |
| Federal Write-In Absentee Ballots returned by UOCAVA voters (a) | 6,784 | 734 | 19,431 |
| Federal Write-In Absentee Ballots counted for UOCAVA voters (a) | 4,383 | 477 | 13,853 |

a UOCAVA voters are voters covered by the Uniformed and Overseas Citizens Absentee Voting Act.
b During the editing process, it was discovered that some jurisdictions provided inaccurate values for these questions (for example, a jurisdiction claimed to have more UOCAVA voters than total registered voters). These jurisdictions were not included in the composite estimates.
Of the seven questions listed in Table 8, four have composite estimates that are within the margin of
error of the production survey, which may indicate that NRB has a limited effect on these estimates.
These questions are total participating UOCAVA voters, regular UOCAVA absentee ballots returned,
regular UOCAVA absentee ballots rejected, and regular UOCAVA absentee ballots counted.
The composite estimates that are not within the margins of error of the production estimates illuminate
two possible concerns with the survey data. First, UOCAVA voters tend to be concentrated in certain
jurisdictions. The response propensity of these jurisdictions greatly affects the survey estimates. For
instance, the mean number of registered UOCAVA voters from the responses to the production survey
was 235, as opposed to 112 for the nonresponse bias study. However, measurement errors may also
have affected the NRB study data due to the delayed fielding period as well as potential
misinterpretation of questions. The production survey data editing reduced but did not eliminate
measurement problems (some clear measurement problems were discovered during further
assessments conducted for the NRB study).
The high concentration of UOCAVA voters in some jurisdictions would not pose a problem except for the second concern: DMDC did not have access to a good correlate of UOCAVA registered voters. The only variable available for all jurisdictions on our sampling frame was overall registered voters (rather than UOCAVA voters), and the weak correlation between UOCAVA voters and all registered voters limits the effectiveness of our stratification, weighting, and poststratification. The correlation coefficient between these two variables is only 0.157, indicating that total registered voters is not a strong predictor of total UOCAVA registered voters. If a better correlate exists, then jurisdictions can be
broken into groups based on this correlate for weighting. In other words, after weighting, jurisdictions
that responded and have large numbers of UOCAVA registered voters would represent all jurisdictions
with large numbers of UOCAVA voters. By controlling weights in this way, each non-respondent
jurisdiction can be represented by similar jurisdictions. However, the closest correlate available to
DMDC was the number of total registered voters. Therefore, when breaking jurisdictions into groups
for weighting, jurisdictions with large numbers of UOCAVA registered voters could not be kept in the
same group. As a result, the response propensity of large UOCAVA jurisdictions has a greater effect
on total estimates than desired.
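The weak-correlation problem can be illustrated with a small sketch. The jurisdiction figures below are made up purely to show the mechanic (UOCAVA counts driven by concentration rather than jurisdiction size); the report's actual frame-level correlation was 0.157.

```python
# Illustrative only: Pearson correlation between total registered voters
# and UOCAVA registered voters when UOCAVA voters are concentrated in a
# few jurisdictions. These numbers are invented for illustration.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

total_registered = [3_000, 8_000, 20_000, 150_000, 900_000]
uocava_registered = [400, 10, 2_500, 60, 300]   # concentration, not size, drives UOCAVA

r = pearson(total_registered, uocava_registered)
print(round(r, 3))  # weak: jurisdiction size tells us little about UOCAVA counts
```

When the correlation is this weak, stratifying on total registered voters cannot keep the high-UOCAVA jurisdictions together in weighting groups, which is exactly the limitation the report describes.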
Proposed Future Methodology
As DMDC prepares the 2012 Post-Election Voting Survey of Local Election Officials, it is important to
find a stronger correlate of UOCAVA registered voters, which is the focus of the survey. By doing so,
jurisdictions within each of the weighting groups would be more similar than in the 2010 survey.
Therefore, the margins of error should be smaller and NRB should be accounted for in a more
systematic way for all survey questions. Because UOCAVA registered voters are so concentrated in a
small number of jurisdictions, even knowing UOCAVA totals for only the highest UOCAVA
jurisdictions would be helpful, as very little variance comes from the rest of the population of
jurisdictions. DMDC plans to use a combination of historical data from FVAP's survey of LEOs and the EAC's election survey to produce a 'UOCAVA' measure of size for the 2012 FVAP survey.
Appendix A.
Production Survey Questionnaire
Appendix B.
Nonresponse Bias Study Questionnaire
2010 Post-Election Response Survey of
Local Election Officials
AGENCY DISCLOSURE NOTICE
The public reporting burden for this collection of information is estimated to average 30 minutes per response, including the time for reviewing the instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing the burden, to the Department of Defense, Executive Services Directorate (0704-0125). Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.
PRIVACY NOTICE
This survey does not collect or use personally identifiable information and is not retrieved by a personal identifier. Therefore, the information collected is not subject to the Privacy Act of 1974, as amended (5 U.S.C. § 552a).

This notice informs you of the purpose of the 2010 Post-Election Voting Surveys and how the findings of these surveys will be used. Please read it carefully.
AUTHORITY: 42 United States Code, Section 1973ff.
PRINCIPAL PURPOSE: This survey is conducted by the Federal Voting Assistance Program (FVAP), which informs and educates United States citizens covered by the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA). The UOCAVA covers members of the Uniformed Services and Merchant Marines, their family members, and citizens residing outside the United States. Reports will be provided to the President and to Congress.
DISCLOSURE: Providing information on this survey is voluntary. Most people can complete the survey in 30 minutes. There is no penalty to you or your office if you choose not to respond. However, maximum participation is encouraged so that the data will be complete and representative. Your individual survey responses will be kept private to the extent permitted by law. If you answer any items and indicate distress or being upset, etc., you will not be contacted for follow-up purposes. However, if you indicate a direct threat to harm yourself or others within responses or communications about the survey, because of concern for your welfare, the Defense Manpower Data Center (DMDC) will notify an office in your area for appropriate action.
SURVEY ELIGIBILITY AND POTENTIAL BENEFITS: Local Election Official offices representing all voting jurisdictions, including the District of Columbia and the U.S. territories, are included in the survey population. There is no direct benefit for your individual participation; however, your responses, when taken together with the responses from all the other Local Election Officials, will make a difference by helping to identify areas where the absentee voting process can be improved.
STATEMENT OF RISK: Completing the survey is not expected to involve any risk or discomfort to you. The only risk is the unintentional disclosure of the data you provide. However, the government and its contractors have a number of policies and procedures to ensure that survey data are safe and protected. Government and contractor staff have been trained to protect survey data.
QUESTIONS ABOUT THIS SURVEY: If you have any questions about this survey, please contact the Survey Processing Center by sending an email to LEOSurvey@osd.pentagon.mil or call, toll-free, 1-800-881-5307. If you have concerns about your rights as a research participant, please contact: Ms. Caroline Miner, Human Research Protection Program Manager for the Office of the Under Secretary of Defense (P&R), HRPP@tma.osd.mil, (703) 575-2677.
Please do not complete or mail this survey.
VOTER REGISTRATION
1. Enter the total number of registered and eligible voters in your jurisdiction who were covered by the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA) in the November 2010 general election. Include both active and inactive UOCAVA voters.

a. Uniformed Service Members (domestic or overseas) .....  ___,___,___   [ ] Zero   [ ] Data not available
b. Overseas Civilians ...................................  ___,___,___   [ ] Zero   [ ] Data not available
Total ...................................................  ___,___,___   [ ] Zero   [ ] Data not available
VOTER TURNOUT
2. Enter the total number of UOCAVA voters in your jurisdiction who participated in the November 2010 general election. Please include all UOCAVA voters who cast absentee ballots, Federal Write-In Absentee Ballots (FWABs), and special state ballots. Also include rejected ballots cast by UOCAVA voters only if your jurisdiction credits the person’s vote history even though the ballot was rejected.

a. Uniformed Service Members (domestic or overseas) .....  ___,___,___   [ ] Zero   [ ] Data not available
b. Overseas Civilians ...................................  ___,___,___   [ ] Zero   [ ] Data not available
Total ...................................................  ___,___,___   [ ] Zero   [ ] Data not available
RECEIPT OF REGULAR UOCAVA ABSENTEE BALLOTS

3. Did your jurisdiction receive any regular absentee ballots from UOCAVA voters for the November 2010 general election?

[ ] Yes
[ ] No  GO TO QUESTION 7
[ ] Don’t know  GO TO QUESTION 7

4. Enter the total number of regular absentee ballots returned by UOCAVA voters for the November 2010 general election. Exclude Federal Write-In Absentee Ballots (FWABs) from your totals.

[ ] Does not apply; My jurisdiction did not track the number of regular absentee ballots that were returned by UOCAVA voters. GO TO QUESTION 5

a. Uniformed Service Members (domestic or overseas) .....  ___,___,___   [ ] Zero   [ ] Data not available
b. Overseas Civilians ...................................  ___,___,___   [ ] Zero   [ ] Data not available
Total ...................................................  ___,___,___   [ ] Zero   [ ] Data not available
REJECTION OF REGULAR UOCAVA ABSENTEE BALLOTS
5. Enter the total number of regular absentee ballots returned by UOCAVA voters that were rejected in your jurisdiction for the November 2010 general election. Exclude Federal Write-In Absentee Ballots (FWABs) from your totals.

[ ] Does not apply; My jurisdiction did not track the number of regular absentee ballots returned by UOCAVA voters that were rejected. GO TO QUESTION 6

a. Uniformed Service Members (domestic or overseas) .....  ___,___,___   [ ] Zero   [ ] Data not available
b. Overseas Civilians ...................................  ___,___,___   [ ] Zero   [ ] Data not available
Total ...................................................  ___,___,___   [ ] Zero   [ ] Data not available
COUNTED REGULAR UOCAVA ABSENTEE BALLOTS
6. Enter the total number of regular absentee ballots returned by UOCAVA voters that were counted in your jurisdiction for the November 2010 general election. Exclude Federal Write-In Absentee Ballots (FWABs) from your totals.

[ ] Does not apply; My jurisdiction did not track the number of regular absentee ballots returned by UOCAVA voters that were counted. GO TO QUESTION 7

a. Uniformed Service Members (domestic or overseas) .....  ___,___,___   [ ] Zero   [ ] Data not available
b. Overseas Civilians ...................................  ___,___,___   [ ] Zero   [ ] Data not available
Total ...................................................  ___,___,___   [ ] Zero   [ ] Data not available
FEDERAL WRITE-IN ABSENTEE BALLOTS (FWABs)

7. Did your jurisdiction receive any Federal Write-In Absentee Ballots (FWABs) from UOCAVA voters for the
November 2010 general election?

[ ] Yes
[ ] No  GO TO QUESTION 12
[ ] Don't know  GO TO QUESTION 12
8. Enter the total number of Federal Write-In Absentee Ballots (FWABs) returned by UOCAVA voters in your
jurisdiction for the November 2010 general election.

[ ] Does not apply; My jurisdiction did not track the number of FWABs returned by UOCAVA voters. GO TO
QUESTION 12

a. Uniformed Service Members (domestic
   or overseas) ....................................................  [ ] Zero   [__,___,___]   [ ] Data not available
b. Overseas Civilians ...........................................  [ ] Zero   [__,___,___]   [ ] Data not available
Total .................................................................  [ ] Zero   [__,___,___]   [ ] Data not available
9. Enter the total number of Federal Write-In Absentee Ballots (FWABs) that were rejected in your jurisdiction
for the November 2010 general election.

[ ] Does not apply; My jurisdiction did not track the number of FWABs returned by UOCAVA voters that were
rejected. GO TO QUESTION 11

a. Uniformed Service Members (domestic
   or overseas) ....................................................  [ ] Zero   [__,___,___]   [ ] Data not available
b. Overseas Civilians ...........................................  [ ] Zero   [__,___,___]   [ ] Data not available
Total .................................................................  [ ] Zero   [__,___,___]   [ ] Data not available
10. Of the total number of Federal Write-In Absentee Ballots (FWABs) returned by UOCAVA voters that were
rejected, how many were rejected due to your jurisdiction receiving the UOCAVA voter's regular absentee
ballot before your state's statutory deadline?

[ ] Does not apply; My jurisdiction did not track how many FWABs were rejected because regular absentee ballots
were received before the state's statutory deadline. GO TO QUESTION 11

Reason for rejecting FWABs: Regular absentee ballot was received by jurisdiction before the statutory
deadline.

a. Uniformed Service Members (domestic
   or overseas) ....................................................  [ ] Zero   [__,___,___]   [ ] Data not available
b. Overseas Civilians ...........................................  [ ] Zero   [__,___,___]   [ ] Data not available
Total .................................................................  [ ] Zero   [__,___,___]   [ ] Data not available
11. Enter the total number of Federal Write-In Absentee Ballots (FWABs) that were counted in your jurisdiction
for the November 2010 general election.

[ ] Does not apply; My jurisdiction did not track the number of FWABs returned by UOCAVA voters that were
counted. GO TO QUESTION 12

a. Uniformed Service Members (domestic
   or overseas) ....................................................  [ ] Zero   [__,___,___]   [ ] Data not available
b. Overseas Civilians ...........................................  [ ] Zero   [__,___,___]   [ ] Data not available
Total .................................................................  [ ] Zero   [__,___,___]   [ ] Data not available
2010 POST-ELECTION VOTING SURVEY
OF LOCAL ELECTION OFFICIALS

12. What was the main reason why you did not
participate in the 2010 Post-Election Voting
Survey of Local Election Officials? Mark one.

[ ] My participation was not legally mandated
[ ] I was too busy
[ ] The survey was too long
[ ] I had already completed other voting surveys
    (e.g., the 2010 Election Assistance
    Commission's Election Administration and
    Voting Survey)
[ ] I forgot to respond
[ ] I do not remember being invited to participate in
    the survey  GO TO Q15
[ ] I don't know
[ ] Some other reason (Please specify)

13. Did you receive any e-mail notifications about
the 2010 Post-Election Voting Survey of Local
Election Officials?

[ ] Yes
[ ] No
[ ] I don't have an e-mail address
[ ] Don't know

14. Did you receive any postal mail notifications
about the 2010 Post-Election Voting Survey of
Local Election Officials?

[ ] Yes
[ ] No
[ ] Don't know

INTERNET ACCESS

15. Do you currently have Internet access in the
office area where you perform your election
official duties?

[ ] Yes
[ ] No
[ ] Don't know

FUTURE FEDERAL VOTING ASSISTANCE
PROGRAM (FVAP) SURVEYS

16. For future Federal Voting Assistance Program
(FVAP) surveys, which of the following survey
methods would you most prefer to respond to?

[ ] A mailed survey
[ ] A Web survey
[ ] Some other option
[ ] No preference

TAKING THE SURVEY

17. If you have comments or concerns that you
were not able to express in answering this
survey, please enter them in the space
provided.
File Type: application/pdf
File Title: Report Title
Author: Wetzeles
File Modified: 2012-06-29
File Created: 2012-06-29