2008 Post-Election Voting Survey
of Overseas Citizens
Statistical Methodology Report
Additional copies of this report may be obtained from:
Defense Technical Information Center
ATTN: DTIC-BRR
8725 John J. Kingman Rd., Suite #0944
Ft. Belvoir, VA 22060-6218
Or from:
http://www.dtic.mil/dtic/order.html
Ask for report by ADA504033
DMDC Report No. 2009-050
August 2009
2008 POST-ELECTION VOTING SURVEY OF
OVERSEAS CITIZENS:
STATISTICAL METHODOLOGY REPORT
Defense Manpower Data Center
Human Resources Strategic Assessment Program
1600 Wilson Boulevard, Suite 400, Arlington, VA 22209-2593
Acknowledgments
Defense Manpower Data Center (DMDC) is indebted to numerous people for their
assistance with the 2008 Post-Election Voting Survey of Overseas Citizens, which was
conducted on behalf of the Office of the Under Secretary of Defense for Personnel and
Readiness (OUSD[P&R]). The survey program is conducted under the leadership of Timothy
Elig, Director, Human Resources Strategic Assessment Program (HRSAP).
Policy officials contributing to the development of this survey include Erin St. Pierre and
Scott Wiedmann (Federal Voting Assistance Program). Other important contributors to the
survey development include Elizabeth Gracon (Department of State) and Mike Wilson (Westat).
DMDC’s Program Evaluation Branch, under the guidance of Brian Lappin, previous
Branch Chief, and Kristin Williams, current Branch Chief, is responsible for the development of
questionnaires in the survey program. The lead survey design analyst was Robert Tinney.
DMDC’s Personnel Survey Branch, under the guidance of David McGrath, Branch Chief,
is responsible for survey sampling methods, survey database construction, and archiving. The
lead operations analyst on this survey was Laverne Wright, supported by Ryan Murphy,
Consortium Research Fellow. The lead statistician on this survey was Mark Gorsak. Westat
performed data collection and editing.
DMDC’s Survey Technology Branch, under the guidance of Frederick Licari, Branch
Chief, is responsible for the distribution of datasets outside of DMDC and maintaining records
on compliance with the Privacy Act and 32 CFR 219.
2008 POST-ELECTION VOTING SURVEY OF
OVERSEAS CITIZENS:
STATISTICAL METHODOLOGY REPORT
Executive Summary
The Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA), 42
USC 1973ff, permits members of the Uniformed Services and Merchant Marine, their eligible
family members, and all citizens who are absent from the United States and its territories to vote
in the general election for federal offices. These groups include:
• Members of the Uniformed Services (including Army, Navy, Air Force, Marine Corps, and Coast Guard),
• U.S. citizens employed by the Federal Government residing outside the U.S., and
• All other private U.S. citizens residing outside the U.S.
The Federal Voting Assistance Program (FVAP), under the guidance of USD(P&R), is
charged with implementing the UOCAVA and evaluating the effectiveness of its programs. The
FVAP Office asked DMDC to design, administer, and analyze post-election surveys on
Uniformed Services voter participation, overseas nonmilitary voter participation, and local
election officials. Without such surveys, the Department would not be able to assess and improve
voter access. In addition, such surveys fulfill Executive Order 12642 (1988), which names the
Secretary of Defense as the "Presidential designee" for administering the UOCAVA and requires
surveys to evaluate the effectiveness of the program in presidential election years.
The objectives of the 2008 post-election surveys are: (1) to gauge participation in the
electoral process by citizens covered by UOCAVA, (2) to assess the impact of the FVAP’s
efforts to simplify and ease the process of voting absentee, (3) to evaluate other progress made to
facilitate voting participation, and (4) to identify any remaining obstacles to voting by these
citizens. Surveys were conducted of military members, federal civilian employees overseas, other
U.S. citizens overseas, voting assistance personnel, and local election officials in the U.S.
This report focuses on the 2008 Post-Election Voting Survey of Overseas Citizens (2008
OAC), which was designed to capture the attitudes and behaviors of overseas American citizens.
This report describes the sampling and weighting methodologies used in the 2008 OAC.
Calculation of response rates is described in the final section.
The population of interest for the 2008 OAC consisted of all American citizens living
overseas, excluding federal civilian employees and members of the U.S. Armed Forces.
The frame used to approximate the population of interest was compiled by the
Department of State from embassy and consulate registration records. From this frame, a sample
was drawn of 10,687 individuals. The survey administration period lasted from November 7,
2008, to January 30, 2009. There were 577 usable questionnaires.
Because of the low response rate and the small number of usable questionnaires, no
weighting was performed. Point estimates from the survey represent only the respondents and
cannot be generalized to the population of overseas citizens.
Observed location, completion, and response rates are provided in the final section of this
report for both the full sample and for population subgroups. These rates were computed
according to the recommendations of the Council of American Survey Research Organizations
(1982) and the American Association for Public Opinion Research (AAPOR, 2008). The
observed location, completion, and response rates were 36%, 15%, and 5%, respectively.
Table of Contents

Introduction
Sample Design and Selection
    Target Population
    Sampling Frame
    Sample Design
    Survey Allocation
    Sample Selection
Survey Administration
    Sample Contact Information
    Survey Administration
    Web Survey Administration
    Mail Survey Administration
Survey Administration Issues
    Mail Delivery Issue
    Other Delivery Issues
Weighting
    Case Dispositions
    Eligible Completed Cases
    Variance Estimation
Location, Completion, and Response Rates
    Ineligibility Rate
    Estimated Ineligible Postal Non-Deliverable/Not Located Rate
    Estimated Ineligible Nonresponse
    Adjusted Location Rate
    Adjusted Completion Rate
    Adjusted Response Rate
References
List of Tables

1. Number of Embassies and Consulates by Region and Embassy Size
2. Number of Embassies in the Sample by Region and Embassy Size
3. Registrant Population Counts by Region and Embassy Size
4. Sample Counts of Registrants by Region and Embassy Size
5. E-Mail Distribution to Overseas Citizens
6. Case Disposition Resolution
7. Sample Size by Case Disposition Categories
8. Complete Eligible Cases by Region
9. Disposition Codes for Response Rates
10. Observed Rates by Region
2008 POST-ELECTION VOTING SURVEY OF
OVERSEAS CITIZENS:
STATISTICAL METHODOLOGY REPORT
Introduction
The Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA), 42
USC 1973ff, permits members of the Uniformed Services and Merchant Marine, their eligible
family members, and all citizens who are absent from the United States and its territories to vote
in the general election for federal offices. These groups include:
• Members of the Uniformed Services (including Army, Navy, Air Force, Marine Corps, and Coast Guard),
• U.S. citizens employed by the Federal Government residing outside the U.S., and
• All other private U.S. citizens residing outside the U.S.
The Federal Voting Assistance Program (FVAP), under the guidance of USD(P&R), is
charged with implementing the UOCAVA and evaluating the effectiveness of its programs. The
FVAP Office asked DMDC to design, administer, and analyze post-election surveys on
Uniformed Services voter participation, overseas nonmilitary voter participation, and local
election officials. Without such surveys, the Department would not be able to assess and improve
voter access. In addition, such surveys fulfill Executive Order 12642 (1988), which names the
Secretary of Defense as the "Presidential designee" for administering the UOCAVA and requires
surveys to evaluate the effectiveness of the program in presidential election years.
The objectives of the 2008 post-election surveys are: (1) to gauge participation in the
electoral process by citizens covered by UOCAVA, (2) to assess the impact of the FVAP’s
efforts to simplify and ease the process of voting absentee, (3) to evaluate other progress made to
facilitate voting participation, and (4) to identify any remaining obstacles to voting by these
citizens. Surveys were conducted of military members, federal civilian employees overseas, other
U.S. citizens overseas, voting assistance personnel, and local election officials in the U.S.
This report describes sampling and weighting methodologies for the 2008 Post-Election
Voting Survey of Overseas Citizens (2008 OAC). The first section of this report discusses the
target population, sample frame, design, and sample selection procedures. The second section
summarizes data collection procedures and the third presents survey case disposition
assignments. The fourth section describes weighting and variance estimation. The final section
describes the calculation of response rates, location rates, and completion rates for the full
sample and for population subgroups. Tabulated results of the survey are reported by DMDC
(2009).
Sample Design and Selection
Target Population
The 2008 OAC was designed to represent adult American citizens residing outside the
United States, excluding federal civilian employees and members of the Armed Forces.
Sampling Frame
The sampling frame used for the 2008 OAC was compiled by the Department of State
(DoS). It was assembled from embassy and consulate registration records of citizens living or
traveling overseas who voluntarily register either online through the Internet Based Registration
System or in person with a U.S. embassy or consulate in the overseas country.
records are generally used only to reach Americans in case of an emergency or to notify them of
a security threat against U.S. citizens in the country where they are living.
DoS registration records had the following limitations as a sampling frame for the survey:
• Registration records have limited coverage of overseas citizens because many overseas citizens do not register.

• Registration coverage varies by geographic region (e.g., relatively complete coverage in Africa and far less complete coverage in European countries or Canada).

• Registration records include persons not eligible for the 2008 OAC survey. For example, they include the names of minors, non-citizens, persons no longer living overseas who did not notify the embassy that they had returned to the United States, and persons living in the United States who want to receive emergency messages because they have a child or family member traveling abroad.

• Registration records contain different types of contact information. Some contain only postal addresses. E-mail addresses are available for only a subset of registrants because the DoS did not begin collecting that information for U.S. citizen registrants until 2000.
Despite these known limitations, the registration lists were considered the best source for
building a sampling frame and drawing a probability sample of overseas adult Americans. The
frame thus includes, but is not restricted to, overseas adult American citizens who registered at
an embassy or consulate. The number of embassies and consulates by geographic region and
embassy size is shown in Table 1. This classification by size and region defines the
stratification used for sample selection. The size of an embassy is defined by the number of
citizens registered there: an embassy is classified as small if it has 5,000 or fewer registered
citizens, medium if it has between 5,001 and 20,000, large if it has between 20,001 and 50,000,
and very large if it has more than 50,000.
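To make these size thresholds concrete, here is a minimal sketch in Python; the function name is illustrative rather than anything used in the survey processing.

```python
# A minimal sketch of the embassy size classification defined above.
def embassy_size(registered_citizens: int) -> str:
    """Classify an embassy by its number of registered citizens."""
    if registered_citizens <= 5000:
        return "Small"
    if registered_citizens <= 20000:
        return "Medium"
    if registered_citizens <= 50000:
        return "Large"
    return "Very Large"

print(embassy_size(4614))    # Small
print(embassy_size(20608))   # Large (between 20,001 and 50,000)
```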
Table 1.
Number of Embassies and Consulates by Region and Embassy Size

Embassy Size    Total    Africa    East Asia/Pacific    Europe    NE and SC Asia    Western Hemisphere
Total             230        44                   38        59                38                    51
Small             124        36                   21        30                23                    14
Medium             66         7                    7        15                11                    26
Large              23         1                    6         8                 2                     6
Very Large         17         0                    4         6                 2                     5
Registration information collected by the DoS (i.e., name and address) is protected by the
Privacy Act. As a result, all processing of information, frame assembly, sample selection, and
survey mailing was performed exclusively by the DoS.
Sample Design
The 2008 OAC sample used a stratified two-stage design. In the first stage, embassies
were stratified by the five geographic regions and four embassy size groups, forming the twenty
sample strata. Within each sample stratum, embassies were sorted alphabetically by country and
city name. From this sorted list, within each stratum, a systematic random sample of embassies
was selected. The number of embassies in the sample by region and embassy size is shown in
Table 2.
Table 2.
Number of Embassies in the Sample by Region and Embassy Size

Embassy Size    Total    Africa    East Asia/Pacific    Europe    NE and SC Asia    Western Hemisphere
Total              70        12                   14        17                12                    15
Small              29         7                    6         6                 6                     4
Medium             21         4                    3         4                 4                     6
Large              12         1                    3         4                 1                     3
Very Large          8         0                    2         3                 1                     2
After the selection of the embassies, the second stage of sampling was implemented. In
this stage, the lists of registered citizens for each selected embassy within a sampling stratum
were separately sorted by name. The registrant population counts by region and embassy size
are shown in Table 3.
Within each stratum, individuals were selected with equal probability without
replacement using systematic random sampling. Because the allocation of the sample was not
proportional to the size of strata, selection probabilities varied among strata, and individuals
were not selected with equal probability overall. Nonproportional allocation was used to achieve
adequate sample sizes for relatively small subpopulations of analytic interest. The key domain
of interest was geographic region.
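For illustration, equal-probability systematic selection within a sorted stratum can be sketched as follows; the helper is our own minimal version, not the production selection code, and the frame is a stand-in list.

```python
# A minimal sketch of systematic random sampling within one stratum.
import random

def systematic_sample(frame, n):
    """Select n units from a sorted frame using a random start
    and a fixed skip interval (equal probability, no replacement)."""
    interval = len(frame) / n
    start = random.uniform(0, interval)
    return [frame[int(start + k * interval)] for k in range(n)]

# e.g., draw 381 registrants from a sorted stratum of 21,870
# (cf. the medium-size East Asia/Pacific stratum in Tables 3 and 4).
stratum = [f"registrant_{i:05d}" for i in range(21870)]
sample = systematic_sample(stratum, 381)
print(len(sample))  # 381
```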
Table 3.
Registrant Population Counts by Region and Embassy Size

Embassy Size      Total     Africa    East Asia/Pacific     Europe    NE and SC Asia    Western Hemisphere
Total           884,321     31,781              195,120    261,461           160,265               235,694
Small            29,510      4,614               10,046      4,637             5,267                 4,946
Medium          166,516     20,608               21,870     38,385            29,094                56,559
Large           210,547      6,559               56,833     85,599             7,083                54,473
Very Large      477,748          0              106,371    132,840           118,821               119,716
Survey Allocation
The total sample size was based on precision requirements for the key reporting
domain, geographic region. The precision goal was a 95% confidence interval of ±5 percentage
points. To achieve this level of precision, approximately 400 completed surveys were needed
for each geographic region. Because there was no past history to draw on in setting assumptions
about location and cooperation for this population, the research team considered several possible
response scenarios. The team assumed an eligibility rate of 60% for adult U.S. citizens. It was
further assumed that, of the eligible persons on the lists, 60% would have e-mail addresses and
40% would have only postal addresses; that the e-mail access rate would be 80%, whereas the
postal access rate would be 60%; and that the overall cooperation rate would be 45%. These
assumptions yielded a required sample size of 2,100 for each geographic region, producing an
expected 408 completed surveys per region. Worldwide, the total sample was set at 10,500.
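Chaining these assumptions shows the arithmetic behind the regional sample size: 2,100 × 0.60 (eligibility) = 1,260 eligible persons; 1,260 × (0.60 × 0.80 + 0.40 × 0.60) = 1,260 × 0.72 ≈ 907 reachable persons; and 907 × 0.45 ≈ 408 completed surveys.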
For each geographic region, the sample size was set at 2,100, an equal allocation
across regions. Within each region, an allocation strategy was needed for selection by embassy
size. As Table 3 shows, population counts differ markedly by embassy size, with very large
embassy populations 20 or more times greater than those of embassies classified as small. If a
proportionate-to-size allocation strategy were adopted, there would be very little survey input
from embassies classified as small. Conversely, if an equal allocation were implemented across
embassy size strata (e.g., 525 per stratum, or 700 per stratum for Africa), a considerable design
effect would result because the selection probabilities would vary greatly.
As a compromise between equal and proportionate allocation, a square root allocation
was used. Under this allocation, the sample is allocated to the subpopulations proportional to the
square root of the size of the subpopulation. Under the square root allocation, the sample is
reallocated from the very large embassies to the smaller embassies as compared to what would
have been done under a proportionate allocation.
This can be put in context when compared to a more general compromise allocation, the
power allocation, under which the sample is allocated proportional to $x^{\lambda}$, where $x$ is the
measure of size and the parameter $\lambda$ can take values between zero and 1. The value
$\lambda = \tfrac{1}{2}$ corresponds to the square root allocation. The two extreme values of $\lambda$ give the equal
allocation and the proportionate-to-size allocation: $\lambda = 0$ corresponds to equal allocation and
$\lambda = 1$ corresponds to proportionate-to-size allocation.
Because of the issues mentioned for the equal and proportionate allocations, the square
root allocation strategy is particularly well suited for the 2008 OAC. Specifically, if we let $n$
denote the total sample size, then $n_h$, the sample allocated to stratum $h$, is computed as

$$n_h = n \frac{\sqrt{N_h}}{\sum_h \sqrt{N_h}},$$

where $N_h$ is the total number of persons in stratum $h$.
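A minimal sketch of the power allocation in Python follows; the stratum counts are illustrative, and the report's actual allocation (Table 4) also reflects rounding and final sample adjustments.

```python
# A minimal sketch of the power allocation described above.
# lam = 0.5 gives the square root allocation used for the 2008 OAC.
def power_allocation(total_n, stratum_sizes, lam=0.5):
    """Allocate total_n across strata proportional to N_h ** lam.

    lam = 0 -> equal allocation; lam = 1 -> proportionate-to-size.
    Rounded allocations may need a final adjustment to sum to total_n.
    """
    weights = [size ** lam for size in stratum_sizes]
    total_weight = sum(weights)
    return [round(total_n * w / total_weight) for w in weights]

# Illustrative registrant counts for three size strata in one region:
sizes = [4614, 20608, 6559]
print(power_allocation(2100, sizes))           # square root allocation
print(power_allocation(2100, sizes, lam=1.0))  # proportionate, for contrast
```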
Sample Selection
Sample selection proceeded in two stages. In the first stage, the frame of all 230
embassies was stratified by geographic region and embassy size into 20 sample strata as shown
in Table 1. From each of the twenty sample strata, a systematic random sample of embassies
was selected to achieve a total sample size of 70 embassies (Table 2). In the second stage of
selection, a sample was drawn from the population of registrants in each sample stratum.
Registrant population counts are shown in Table 3. Sample allocation among the strata within a
geographic region was made using the square root strategy. Sample selection was accomplished
using the systematic random method. Sample counts of registrants by region and embassy size
are shown in Table 4.
Table 4.
Sample Counts of Registrants by Region and Embassy Size

Embassy Size     Total    Africa    East Asia/Pacific    Europe    NE and SC Asia    Western Hemisphere
Total           10,687     2,118                2,090     2,281             2,099                 2,099
Small            1,299       493                  256       157               227                   166
Medium           2,965     1,044                  381       446               532                   562
Large            2,853       581                  610       847               263                   552
Very Large       3,570         0                  843       831             1,077                   819
Survey Administration
Sample Contact Information
Survey administration for the 2008 OAC began on November 7, 2008, and continued
through January 30, 2009. The survey was administered in mixed mode, in both Web and paper
formats. Sample members with an e-mail address (2,651) were initially assigned to the Web
survey, whereas sample members with only a postal address (7,734) were initially assigned to
the paper survey. An additional 302 cases were determined to be ineligible (e.g., a minor, living
in the United States, a noncitizen, or deceased) before the start of data collection.
The DoS could not share protected contact information (e.g., registrant names, postal
addresses, and e-mail addresses) with other members of the research team. Consequently, all
labeling and mailing operations (postal and e-mail) were performed at the DoS offices in
Washington, DC.
Survey Administration
For both the Web and paper administration, the data collection plan called for three types
of communication with sampled American citizens: pre-notification, survey invitation, and
thank you/reminder. The pre-notification would alert sampled individuals that they had been
selected for participation in the survey and provide background on the purpose and sponsor of
the survey. The second communication, the “survey invitation,” would contain the paper survey
for postal recipients or a link to the survey for web recipients. Finally, the third type of
communication would be a “thank you/reminder.” After a specified period following survey
invitation/distribution, the “thank you/reminder” would be sent. The main purpose of this
communication was to remind sampled individuals of the survey and ask them to complete and
return the survey.
Web Survey Administration
The DoS sent e-mail pre-notifications, survey invitations, and thank you/reminders to
members of the survey sample with known e-mail addresses under the signature of Janice L.
Jacobs, Assistant Secretary of State for Consular Affairs. The dates of the e-mail distribution are
shown in Table 5.
The survey invitations and thank you/reminders included a hyperlink to the survey Web
site and a unique Ticket Number for logging on to the survey. Thank you/reminders were sent to
all sample members excluding those who had been identified as ineligible or whose earlier thank
you/reminder had bounced back. The pre-notifications did not include the ticket number or Web
site. Please see DMDC (In preparation) for further information on survey administration.
Table 5.
E-Mail Distribution to Overseas Citizens

Type of E-Mail        Date
Pre-notification      10/31/08
Survey invitation     11/7/08
Thank you/reminder    11/14/08
Thank you/reminder    12/4/08
Thank you/reminder    12/15/08
Thank you/reminder    1/5/09
Mail Survey Administration
The pre-notification letter, as well as the survey cover letter and the thank you/reminder
letters, were all sent under the signature of Janice L. Jacobs, Assistant Secretary of State for
Consular Affairs. Printed pre-notification letters in franked envelopes were delivered to the
DoS, where labels with names and addresses were attached to the mailing envelopes. During the
labeling and assembly of the letters, the DoS removed several hundred letters with bad addresses.
Thus, pre-notification letters were sent to approximately 7,100 of the original 7,734 overseas
citizens assigned to the paper survey. After the letters were assembled, they were mailed via the
United States Postal Service (USPS) on October 30 and 31, 2008.
In early November, the DoS received survey invitation packet materials. These included
a survey cover letter, the printed survey, an envelope with a DoS address for returning the
completed survey, and an outer mailing envelope. Packet materials were assembled and labeled
between November 7 and November 10, 2008. Once assembled, the packets were expected to
be picked up and processed by the USPS.
Thank you/reminder letters and envelopes were assembled and labeled at the DoS and
mailed via USPS from November 19 to November 21, 2008. No other thank you/reminders
were mailed.
Return postage was not affixed to the outer mailing envelope. Sample members had
three options for returning their completed paper surveys. The first option was to return the
survey personally to their embassy or consulate. The second was to mail the survey to their
embassy or consulate using their own postage. Once an embassy or consulate received a survey,
the survey was sent via diplomatic pouch to DoS. The DoS delivered them in batches via FedEx
to Westat for data entry, cleaning, and processing. The third option was for the respondent to
supply their own postage on the return envelope and mail it directly to the DoS in Washington,
DC. The DoS then delivered the survey to Westat.
Survey Administration Issues
Mail Delivery Issue
The planned survey invitation mailing experienced difficulties. Contrary to the plan of
using the USPS, employees in the DoS mail room re-sorted the survey invitation packets and
placed them in diplomatic pouches for delivery overseas. This change in procedure was not
communicated to the DoS staff working on the 2008 OAC. As a consequence, survey packets
were stuck when they arrived at an embassy: bearing U.S. postage, they could not be mailed or
delivered locally unless local postage was added to each packet.
The mailing problem was discovered when the DoS learned, near the end of November,
that some overseas citizens had received the pre-notification letter and a thank you/reminder
letter but not the survey packet. The DoS began tracking the location of the surveys and
confirmed in early December that the surveys had all been sent via diplomatic pouches rather
than by the USPS. The DoS then undertook a thorough effort to document survey status by post.
Its findings are summarized below.
• Nineteen posts reported they mailed or delivered the surveys locally to sample members, another post delivered some of them locally, and surveys for sample members in Canada were re-sent from New York. Another post reported it received the surveys, but there was no information about what was done with them.

• Nine posts sent the undelivered surveys back to DoS Headquarters in Washington, DC, or notified the DoS that they were en route. In seven of those instances, the DoS re-mailed the surveys to sample members; however, some were re-mailed late in the data collection period to two posts: on January 22, 2009, to Venezuela and on December 29, 2008, to Mexico.

• The surveys were either not seen by, or no answer about the surveys was received from, 34 posts. However, completed surveys or undeliverable surveys were received from 18 of those posts, indicating that some effort was made to deliver surveys to sample members registered with those posts.
Other Delivery Issues
Other postal mail issues included the following:

• Pre-notification letters mailed in October were returned to the DoS, marked "return to sender," several months after the close of the survey.

• The postal reminders did not include the Web site address and ticket number.

For the Web administration, there were browser connectivity issues with the Web
address. Respondents sent e-mail messages to the Westat help center, and alternative methods
of linking to the site were provided to them.
Weighting
Because of the low response rate and the small number of usable questionnaires, no
weighting was performed. Point estimates from the survey represent only the respondents and
cannot be generalized to the population of overseas citizens.
Case Dispositions
Case dispositions were assigned based on eligibility for the survey and completion of the
returned survey.
Final case dispositions were determined using information from field operations (the
Survey Control System, or SCS) and returned surveys. No single source of information is both
complete and correct; inconsistencies among these sources were resolved according to the order
of precedence shown in Table 6.
Table 6.
Case Disposition Resolution

Frame ineligible (source: personnel record): Ineligible on the list.

Ineligible by self- or proxy-report (source: SCS): Ill, incarcerated, or deceased.

Survey response: ineligible (source: survey questionnaire): Respondent is not a U.S. citizen or is less than 18 years old.

Eligible, complete response (source: item response rate): Item response is at least 50% for respondents who were registered voters. All respondents identified as not registered were eligible and complete for the survey.

Eligible, incomplete response (source: item response rate): Return is not blank, but item response is less than 50% for registered voters.

Unknown eligibility, complete (source: personnel record, first survey question, item response rate): Incomplete personnel record AND first survey item is missing AND item response is at least 50%.

Active refusal (source: SCS): Reason for refusal is "any;" ineligible reason is "other;" reason survey is blank is "refused-too long," "ineligible-other," "unreachable at this address," "refused by current resident," or "concerned about security/confidentiality."

PND (source: SCS): Postal non-delivery or original non-locatable.

Nonrespondent (source: remainder): Remainder.
This order is critical to resolving case dispositions. For example, suppose a sample
person refused the survey, with the reason that it was too long; in the absence of any other
information, the disposition would be “eligible nonrespondent.” If a proxy report was also given
that the sample person had been hospitalized and was unable to complete the survey, the
disposition would be “ineligible.”
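The precedence ordering can be sketched as a fall-through chain, checked top to bottom; the field names and simplified conditions below are hypothetical, and the unknown-eligibility branch is omitted for brevity.

```python
# A minimal sketch of precedence-ordered disposition resolution (Table 6).
def resolve_disposition(case: dict) -> str:
    """Return the first matching disposition in order of precedence."""
    if case.get("frame_ineligible"):
        return "frame ineligible"
    if case.get("scs_reason") in ("ill", "incarcerated", "deceased"):
        return "ineligible by self- or proxy-report"
    if case.get("non_citizen") or case.get("under_18"):
        return "survey response: ineligible"
    if case.get("item_response_rate", 0.0) >= 0.5:
        return "eligible, complete response"
    if case.get("returned"):
        return "eligible, incomplete response"
    if case.get("refused"):
        return "active refusal"
    if case.get("postal_non_delivery"):
        return "postal non-delivery (PND)"
    return "nonrespondent"

# The hospitalized refuser in the example above resolves as ineligible,
# because the self/proxy-report rule precedes the refusal rule.
print(resolve_disposition({"scs_reason": "ill", "refused": True}))
```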
Final case dispositions for the 2008 OAC are shown in Table 7.
Table 7.
Sample Size by Case Disposition Categories

Case Disposition Category and (Code Value)    Sample Size
Total                                              10,687
Frame Ineligible                                        6
Ineligible Response
    Self/Proxy-report (2)                             861
    Survey Self report (3)                            121
Eligible Response
    Complete (4)                                      577
    Incomplete (5)                                     34
Unknown eligibility                                     5
Refused/Deployed/Other (8)                              3
Postal Non-Delivery (10)                            2,343
Non-respondents (11)                                6,737
Eligible Completed Cases
Table 8 shows the number of respondents by region.
Table 8.
Complete Eligible Cases by Region

         Total    Africa    East Asia/Pacific    Europe    NE and SC Asia    Western Hemisphere
Total      577        99                  134       174               105                    65
Variance Estimation
Because of the low number of usable questionnaires, no variance estimation procedures,
such as the creation of variance strata, were performed.
Location, Completion, and Response Rates
Location, completion, and response rates were calculated in accordance with guidelines
established by the Council of American Survey Research Organizations (CASRO). The
procedure is based on recommendations for Sample Type II response rates (Council of American
Survey Research Organizations, 1982). This definition corresponds to The American
Association for Public Opinion Research (AAPOR) RR3 (AAPOR, 2008), which estimates the
proportion of eligible cases among cases of unknown eligibility.
Location, completion, and response rates were computed for the 2008 OAC as follows:
The location rate (LR) is defined as

$$LR = \frac{\text{adjusted located sample}}{\text{adjusted eligible sample}} = \frac{N_L}{N_E}.$$

The completion rate (CR) is defined as

$$CR = \frac{\text{usable responses}}{\text{adjusted located sample}} = \frac{N_R}{N_L}.$$

The response rate (RR) is defined as

$$RR = \frac{\text{usable responses}}{\text{adjusted eligible sample}} = \frac{N_R}{N_E},$$

where

• $N_L$ = adjusted located sample
• $N_E$ = adjusted eligible sample
• $N_R$ = usable responses.
To identify the cases that contribute to the components of LR, CR, and RR, the
disposition codes were grouped as shown in Table 9.
Table 9.
Disposition Codes for Response Rates

Case Disposition Category    Code Value
Eligible Sample              4, 5, 8, 10, 11
Located Sample               4, 5, 8, 11
Eligible Response            4
No Return                    11
Eligibility Determined       2, 3, 4, 5, 8
Self Report Ineligible       2, 3

Note. Code values are from Table 7.
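These groupings are simple enough to express as code-to-component sets; a minimal sketch follows, where counts_by_code is a hypothetical mapping from disposition code to number of cases.

```python
# A minimal sketch of the Table 9 disposition-code groupings.
COMPONENTS = {
    "eligible_sample":        {4, 5, 8, 10, 11},
    "located_sample":         {4, 5, 8, 11},
    "eligible_response":      {4},
    "no_return":              {11},
    "eligibility_determined": {2, 3, 4, 5, 8},
    "self_report_ineligible": {2, 3},
}

def component_total(counts_by_code, component):
    """Sum case counts over the disposition codes in one component."""
    return sum(count for code, count in counts_by_code.items()
               if code in COMPONENTS[component])

# Hypothetical counts by disposition code:
counts = {2: 900, 3: 120, 4: 600, 5: 30, 8: 5, 10: 2300, 11: 6700}
print(component_total(counts, "eligible_sample"))  # 9635
```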
Ineligibility Rate

The ineligibility rate (IR) is defined as

$$IR = \frac{\text{self-report ineligible cases}}{\text{eligibility determined cases}}.$$

Estimated Ineligible Postal Non-Deliverable/Not Located Rate

The estimated ineligible postal non-deliverable/not located rate (IPNDR) is defined as

$$IPNDR = (\text{Eligible Sample} - \text{Located Sample}) \times IR.$$

Estimated Ineligible Nonresponse

The estimated ineligible nonresponse (EINR) is defined as

$$EINR = (\text{Not Returned}) \times IR.$$

Adjusted Location Rate

The adjusted location rate (ALR) is defined as

$$ALR = \frac{\text{Located Sample} - EINR}{\text{Eligible Sample} - IPNDR - EINR}.$$

Adjusted Completion Rate

The adjusted completion rate (ACR) is defined as

$$ACR = \frac{\text{Eligible Response}}{\text{Located Sample} - EINR}.$$

Adjusted Response Rate

The adjusted response rate (ARR) is defined as

$$ARR = \frac{\text{Eligible Response}}{\text{Eligible Sample} - IPNDR - EINR}.$$
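Putting these definitions together, the adjusted rates can be computed as in the minimal sketch below; the disposition counts in the example are hypothetical, chosen only to exercise the formulas. Note that ARR = ALR × ACR by construction.

```python
# A minimal sketch of the CASRO/AAPOR-style adjusted rates defined above.
def adjusted_rates(eligible, located, usable, no_return,
                   eligibility_determined, self_report_ineligible):
    ir = self_report_ineligible / eligibility_determined  # ineligibility rate
    ipndr = (eligible - located) * ir   # est. ineligibles among not located
    einr = no_return * ir               # est. ineligibles among nonrespondents
    alr = (located - einr) / (eligible - ipndr - einr)    # adjusted location
    acr = usable / (located - einr)                       # adjusted completion
    arr = usable / (eligible - ipndr - einr)              # adjusted response
    return alr, acr, arr

# Hypothetical counts, not the survey's actual dispositions:
alr, acr, arr = adjusted_rates(eligible=9000, located=4000, usable=600,
                               no_return=3000, eligibility_determined=1500,
                               self_report_ineligible=900)
print(f"ALR={alr:.1%}  ACR={acr:.1%}  ARR={arr:.1%}")
```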
Observed location, completion, and response rates by region for 2008 OAC are shown in
Table 10.
Table 10.
Observed Rates by Region

Domain                  Sample Size    Usable Responses    Location Rate (%)    Completion Rate (%)    Response Rate (%)
Sample                       10,687                 577                 35.8                   14.8                  5.4
Region
  Africa                      2,118                  99                 22.8                   20.3                  4.7
  East Asia/Pacific           2,090                 134                 41.6                   15.2                  6.4
  Europe                      2,281                 174                 43.3                   17.3                  7.6
  NE and SC Asia              2,099                 105                 37.9                   12.8                  5.0
  Western Hemisphere          2,099                  65                 37.1                    8.3                  3.1
References
American Association for Public Opinion Research. (2008). Standard definitions: Final
dispositions of case codes and outcome rates for surveys (5th ed.). Lenexa, KS: AAPOR.

Council of American Survey Research Organizations. (1982). On the definition of response
rates (special report of the CASRO task force on completion rates, Lester R. Frankel, Chair).
Port Jefferson, NY: Author.
DMDC. (In preparation). 2008 Post-Election Voting Survey of Overseas Citizens:
Administration, datasets, and codebook (Report No. 2009-048). Arlington, VA: Author.
DMDC. (2009). 2008 Post-Election Voting Survey of Overseas Citizens: Tabulations of
responses (Report No. 2009-047). Arlington, VA: Author.