Statistical Sample Plan Telemedicine

Clearance for A-11 Section 280 Improving Customer Experience Information Collection

OMB: 2900-0876

Service Level Measurements: Telehealth Survey

Sampling Methodology Report

Prepared by

Veteran Experience Office

Version 1 June 2018





Contents

Executive Summary

Telehealth is an effective and convenient way for patients to receive, and for clinicians to provide, quality care management. The initiative incorporates telecommunications technology to improve and modernize health care offered by the Veterans Health Administration (VHA). To assess patient satisfaction with the program and identify areas for intervention or further evaluation, the Telehealth Services Office within VHA enlisted the services of the Veterans Experience Office (VEO). The Veteran Telehealth Survey is designed to measure the customer experience of using VA electronic health services within the three major aspects, or modalities, of Telehealth: Clinical Video Telehealth (CVT), Home Telehealth (HT), and Store and Forward Telehealth (SFT). The purpose of this report is to document the survey methodology and sampling plan. Information about quality assurance protocols, as well as limitations of the survey methodology, is also included.

The study consists of a nationally representative survey of Veterans who utilized telehealth services within the past week. Beginning in July 2018, a weekly sample of such Veterans will be drawn, and sampled Veterans will be asked questions regarding their satisfaction with a specific telehealth modality. Survey links are disseminated to patients undergoing care at different points (stages) in the health monitoring process, which differ by modality. The online questionnaire is brief and follows a human-centered design methodology focusing on Trust, Emotion, Effectiveness, and Ease with the care they received. The measurement questions are based on a five-point Likert scale (1-5) from Strongly Disagree (1) to Strongly Agree (5).

The overall sample size is determined based on a 95% confidence level and 3% margin of error with respect to monthly estimates. This is a compromise between obtaining ongoing cross-sectional estimates of suitable precision and maintaining low sampling rates over time. The selection process is a stratified design controlling for geographic region (district), demographics, modality, and stage. Once data collection is completed, participant responses will be weighted so that the sample is more representative of the overall population. Iterative proportional fitting will be used to create sample weights from the following variables: Modality/Stage, Gender, Age Group (18-39, 40-59, 60+), and District.

Once the data is collected, it is immediately available in the Veteran Insight System. Survey weights are incorporated into the system at the close of every weekly survey. The interface allows data users to analyze the survey results using interactive charts and sub-populations. Survey data may also be reviewed over differing time periods, ranging from weekly, to monthly, to quarterly estimates.

Part I – Introduction

A. Background

The Enterprise Measurement and Program Improvement team (EM&PI) is part of the Insights and Analytics (I&A) division within the Veterans Experience Office (VEO). The EM&PI team is tasked with conducting transactional surveys of the Veteran population to measure satisfaction with the Department of Veterans Affairs' (VA) numerous benefit services. Its mission is to empower Veterans by rapidly and discreetly collecting feedback on their interactions with such VA entities as NCA, VHA, and VBA. VEO surveys generally entail probability samples which contact only the minimal number of Veterans necessary to obtain reliable estimates. This information is subsequently used by internal stakeholders to monitor, evaluate, and improve beneficiary processes. Veterans are always able to decline participation and may opt out of future invitations. A quarantine protocol is maintained across all VEO surveys to limit the number of times a Veteran may be contacted, in order to prevent survey fatigue.

Surveys issued by EM&PI are generally brief and present a low burden to Veterans. A few targeted questions utilize a human-centered design (HCD) methodology, revolving around the concepts of Trust, Ease, Effectiveness, and Emotion. Questions focus on a specific aspect of a service process, spanning communication, applying for benefits, deliberation, and/or receipt of benefits. Structured questions directly address the pertinent issues for each surveyed line of business. The opportunity to volunteer open-ended text responses is provided within most surveys; this open text has been demonstrated to yield rich information. Machine learning tools are used for text classification, ranking by sentiment scores, and screening for homelessness, depression, and similar concerns. Modern survey theory is used to create sample designs that are representative, statistically sound, and in accordance with OMB guidelines on federal surveys.

VA uses a wide variety of technologies to facilitate quality healthcare for its beneficiaries. Telehealth services are a critical aspect of modernizing the VA health care system. Telehealth (TH) uses information technology and telecommunications to increase Veterans' access to high-quality services, especially for those who live in remote areas or are incapacitated. In FY 2017, over 700,000 patients received care via the three central telehealth modalities1. Clinical Video Telehealth (CVT) is the use of real-time interactive video conferencing to assess, treat, and provide patient care remotely. Veterans may be linked to physicians from a local clinic or even from home, for over 50 clinical applications ranging from primary care to numerous specialties (e.g., tele-dermatology). Home Telehealth (HT) is applied to high-risk Veterans with chronic disease requiring long-term care. Care management is augmented through such technologies as in-home and mobile monitoring, messaging, and/or video conferencing. The goal of HT is to reduce complications, hospitalizations, and clinic/ER visits, so at-risk patients may remain in their own homes. Finally, Store and Forward Telehealth (SFT) concerns the acquisition and storage of electronic patient information (e.g., images, sounds, and video) collected at a VA clinic or medical center. The information is forwarded to and retrieved by healthcare professionals at another VA medical facility, where an assessment is performed.

The Veterans Experience Office (VEO) has been commissioned by the Veterans Health Administration (VHA) to measure the satisfaction of Telehealth recipients regarding their electronic interactions with physicians, nursing professionals, and other medical staff. It also seeks Veteran input on the quality of the treatment they received via the three modalities listed above. VEO proposes to conduct a brief transactional survey of Veterans who utilized the service within the past week. A subset of Veterans will be randomly selected to participate. Sampled patients will be contacted through an invitation email containing a link so the survey may be completed using an online interface with customized patient information. The survey itself will consist of a handful of questions revolving around a human-centered design, focusing on such elements as trust, emotion, effectiveness, and ease with the care they received.

B. Basic Definitions

Coverage

The percentage of the population of interest that is included in the sampling frame.

Measurement Error

The difference between the response coded and the true value of the characteristic being studied for a respondent.

Non-Response

Failure of some respondents in the sample to provide responses in the survey.

Transaction

A transaction refers to the specific time a Veteran interacts with the VA that impacts the Veteran’s journey and their perception of VA’s effectiveness in caring for Veterans.

Response Rate

The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.

Sample

In statistics, a data sample is a set of data collected and/or selected from a statistical population by a defined procedure.

Sampling Error

Error in estimation due to taking a particular sample instead of measuring every unit in the population.

Sampling Frame

A list, map, or other specification of units in the population from which a sample may be selected.

Reliability

The consistency or dependability of a measure, often quantified by the standard error.



C. Application to Veterans Affairs

In general, customer experience and satisfaction are measured at three levels: the enterprise level, the service level, and point-of-service feedback. This measurement may bring insights and value to all stakeholders at VA. Front-line VA leaders can resolve individual feedback from Veterans and take steps to improve the customer experience; meanwhile, VA executives can receive real-time updates on systematic trends that allow them to make changes. The goals of service-level measurement are:

1) To collect continuous customer experience data on the moments that make or break the service experience

2) To help field staff and the national office identify areas of improvement

3) To understand emerging drivers and detractors of customer experience.

Part II – Methodology

A. Target Population and Frame

The target population of the TH survey is all Veterans having an Outpatient CVT, HT, or SFT event in the past 7 days. The identification of Telehealth patients utilizes weekly data extracts from the Corporate Data Warehouse (CDW), which houses the operational records of VHA. Each Telehealth event eligible for a VEO survey will be associated with one of these three modalities. The classification of TH events into a modality is based on a combination of primary and/or secondary stop codes (see Appendix 1), as indicated by VSSC documentation. Under each modality, three types of surveys are designed to inquire about veterans’ experience in terms of different VA service domains (see Table 1).

Patients with scheduled CVT appointments are identified from the general Outpatient scheduling database table in CDW. When an actual CVT or SFT appointment occurs, the distinction between offsite appointments (home, mobile, or non-VA facilities) and appointments taking place within a VAMC or CBOC is based on secondary stop codes. [Need to add data sources for SFT Feedback and Home Telehealth New Enrollments and Disenrollments]. A subset of Veterans in each modality and subtype will be randomly selected to participate in the survey. However, the subtypes of [add information here] are sparsely populated, so these will be selected into each weekly sample with certainty. Telehealth subtypes that comprise less than 10% of the modality population will be selected with certainty. In total, there will be nine sets of survey questions.

Table 1. Survey Types under Telehealth Modalities

Telehealth Modality        | Subtype 1             | Subtype 2            | Subtype 3
---------------------------|-----------------------|----------------------|----------------------
Clinical Video Telehealth  | Scheduled Appointment | Visited VAMC or CBOC | Offsite Appointment
Store and Forward          | Visited VAMC or CBOC  | Offsite Appointment  | Received HCP Feedback
Home Telehealth            | New Enrollment        | Patient Monitoring   | Disenrollment



  1. Sample Size Determination

For a given margin of error and confidence level, the sample size is calculated as below (Lohr, 1999):

For a population that is large, the equation below is used to yield a representative sample for proportions:

  n0 = (z^2 × p × q) / e^2

where

  • z = 1.96, which is the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 - p.

  • Note that pq attains its maximum value when p = 0.5, and this is sometimes used for a conservative sample size (i.e., large enough for any proportion).

  • e = the desired level of precision; in the current case, the margin of error e = 0.03, or 3%. Also referred to as MOE.



For a population that is relatively small, the finite population correction is used to yield a representative sample for proportions:

  n = n0 / (1 + n0/N)

where

  • n0 = representative sample for proportions when the population is large.

  • N = population size.
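As an illustration, the two equations above can be combined in a short Python sketch (the function name is ours, not part of the methodology) that reproduces the precision figures reported in Table 2:

```python
import math

def sample_size(N, p=0.5, z=1.96, e=0.03):
    """Representative sample size for a proportion, with the
    finite population correction (Lohr, 1999)."""
    n0 = z**2 * p * (1 - p) / e**2       # large-population sample size
    return math.ceil(n0 / (1 + n0 / N))  # finite population correction

# Approximate monthly populations from Table 2
print(sample_size(60_000))  # Clinical Video    -> 1049
print(sample_size(81_000))  # Home Telehealth   -> 1054
print(sample_size(27_000))  # Store and Forward -> 1027
```

Using the conservative p = 0.5, the large-population size is about 1,067; the correction trims it slightly for each monthly population.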





The margin of error surrounding the baseline proportion is calculated as:

  e = z × sqrt( (p × q / n) × (N - n) / (N - 1) )

where

  • z = 1.96, which is the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

  • N = population size.

  • n = representative sample size.

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 - p.
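As a quick check of this formula (a minimal sketch; the function name is ours), a monthly Clinical Video sample of 1,049 drawn from roughly 60,000 patients yields approximately the targeted 3% margin of error:

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Margin of error for an estimated proportion, including the
    finite population correction factor (N - n) / (N - 1)."""
    return z * math.sqrt((p * (1 - p) / n) * (N - n) / (N - 1))

print(round(margin_of_error(1049, 60_000), 4))  # -> 0.03
```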



Sample sizes are calibrated to ensure monthly reports have a 3% MOE at a 95% Confidence Level at the modality level for Veterans 18+. This represents an industry standard for reliability widely used by survey administrators (Lohr, 1999). In order to improve measurements at the facility-level, the sample sizes are marginally increased for the CVT modality, but are still kept low enough to prevent excessive repeated contacts. The CVT population may support an increased sampling rate (42%), but this is not the case for SFT, which already has a sampling rate above 50% (62%). Although the Home Telehealth rate is relatively low (21%), there is little month-to-month turnover within this modality: HT patients receive care on a more permanent basis. Therefore, it is reasonable to maintain a lower sampling rate since there will be repeated sampling over time.

Table 2 depicts the number of unique Telehealth patients that received care in fiscal year 2017, along with the approximate monthly populations. Preliminary analysis of the Telehealth patient population indicates that approximately 30% of such patients have provided an email address to the VHA. This represents the frame population for the survey (see the section below for information on possible bias due to frame under-coverage). Table 2A shows the sample targets for each TH modality for the three reporting periods, while Table 2B provides the expected number of Veterans that need to be invited to achieve the sample targets, presuming a response rate of 20% (as observed in the VEO Outpatient Survey).

Table 2. Target Population Figures

Survey Stratum     | Unique Population in FY 2017 | Approximate Monthly Population | Approximate Monthly Email Population | Precision at 3% MOE
-------------------|------------------------------|--------------------------------|--------------------------------------|--------------------
Clinical Video     | 336,000                      | 60,000                         | 18,000                               | 1,049
Home Telehealth    | 145,000                      | 81,000                         | 24,300                               | 1,054
Store and Forward  | 306,000                      | 27,000                         | 8,100                                | 1,027



Source: VA Telehealth Services Factsheet for FY18

Table 2A. Proposed Sample Targets by Time Period

Survey Stratum     | Weekly Target | Monthly Target | Quarterly Target
-------------------|---------------|----------------|-----------------
Clinical Video     | 375           | 1,500          | 4,500
Home Telehealth    | 250           | 1,000          | 3,000
Store and Forward  | 250           | 1,000          | 3,000
Total              | 875           | 3,500          | 10,500





Table 2B. Proposed Number of Invited Telehealth Patients, by Time Period

Survey Stratum     | Weekly Contacts | Monthly Contacts | Monthly Email Sampling Rate | Quarterly Contacts
-------------------|-----------------|------------------|-----------------------------|-------------------
Clinical Video     | 1,875           | 7,500            | 42%                         | 22,500
Home Telehealth    | 1,250           | 5,000            | 21%                         | 15,000
Store and Forward  | 1,250           | 5,000            | 62%                         | 15,000
Total              | 4,375           | 17,500           | N/A                         | 52,500



  2. Stratification



As noted in the section above, stratification is employed to ensure that sufficient numbers of Veterans will be sampled within each of the three modalities. Additionally, a second layer of stratification is used to ensure ample representation of each of the three subtypes within a modality. Thus there are two stratification variables from which explicit sample targets are derived: modality and subtype. These two stratification variables are called explicit. The allocation to subtypes within each modality is proportional to the underlying population (as opposed to the email population). In the event that a subtype population is less than 10% of the modality population, those telehealth patients are sampled with certainty.
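The allocation and certainty rules described above might be sketched as follows (a hedged illustration: the subtype names and counts are made up, not actual VEO figures):

```python
def allocate(subtype_pops, modality_target):
    """Allocate a modality's sample target across its subtypes.

    Subtypes comprising less than 10% of the modality population are
    taken with certainty (every patient selected); the remaining target
    is spread over the other subtypes in proportion to population.
    """
    total = sum(subtype_pops.values())
    certain = {s: n for s, n in subtype_pops.items() if n < 0.10 * total}
    rest = {s: n for s, n in subtype_pops.items() if s not in certain}
    remaining = max(modality_target - sum(certain.values()), 0)
    rest_total = sum(rest.values())
    alloc = dict(certain)  # certainty strata keep their full population
    for s, n in rest.items():
        alloc[s] = round(remaining * n / rest_total)
    return alloc

# Illustrative only: one sparse subtype (about 5% of the population)
print(allocate({"Subtype 1": 5_000, "Subtype 2": 3_500, "Subtype 3": 450}, 1_000))
```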

To ensure samples are balanced with respect to the demographic variables Age Group, Gender, District, and VAMC/CBOC, the random selection of patients within each stratum will follow a systematic sampling design. The Veterans are sorted according to the demographic variables, and every nth patient will be selected for survey invitation (the value of n will change randomly with each new survey iteration). This mechanism ensures that the resulting respondent sample resembles the email population with respect to the demographic variables. Since these stratification variables do not have explicit targets for each permutation, they are deemed implicit stratification variables.
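The systematic selection step can be sketched as below (a minimal illustration assuming the frame is already sorted on the implicit stratification variables; the function name is ours):

```python
import random

def systematic_sample(frame, n):
    """Select n units via systematic sampling with a random start.

    The frame should already be sorted on the implicit stratification
    variables (Age Group, Gender, District, VAMC/CBOC) so the sample
    mirrors their distribution in the email population.
    """
    step = len(frame) / n                # fractional skip interval
    start = random.random() * step       # random start varies each iteration
    return [frame[int(start + i * step)] for i in range(n)]

frame = sorted(range(10_000))            # stand-in for a sorted patient frame
sample = systematic_sample(frame, 250)
print(len(sample))  # 250
```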

Although we do not expect differences between the email population and the general population with regard to geography, email populations tend to skew somewhat younger and more female. Since these groups are less represented in the Veteran population, it is not problematic for these demographics to be marginally oversampled – sample weighting calibrated to the general population will ensure valid representation and correct for any imbalances.

Stratification Type | Variables
--------------------|-----------------------------------------
Explicit            | Modality, Subtype
Implicit            | Age Group, Gender, District, VAMC, CBOC



  3. Data Collection Methods

At the beginning of every measurement period, VEO data analysts will access the Corporate Data Warehouse (CDW), which contains the governmental database for all VHA interactions. The telehealth target population will be extracted and recorded with each new iteration. Veterans with a valid email address will be included in the survey frame. A new random sample, drawn according to the stratification and quarantine protocols defined in this report, will be used to create an invitation file. Emails are immediately delivered to all selected patients. Selected respondents will be contacted within 7 days of their Telehealth interaction and will have 14 days to complete the survey. Estimates will be accessible to data users instantly, with the final weighted results available 14 days after the beginning of the survey.

Table 3. Survey Mode

Mode of Data Collection | Recruitment Method | Time After Transaction                     | Recruitment Period              | Collection Days
------------------------|--------------------|--------------------------------------------|---------------------------------|----------------
Online Survey           | Email Recruitment  | Within 7 days after Telehealth Appointment | 14 Days (Reminder after 7 Days) | Friday



  4. Reporting

Researchers will be able to use the Veteran Insight System (powered by Medallia) for interactive reporting and data visualization. VA employees with a PIV card may access the system at https://va.voice.medallia.com/sso/va/. Trust, Ease, Effectiveness, and Emotion scores can be observed for each Modality and Subtype (or Survey Type). The scores may be viewed by Age Group, Gender, and Race/Ethnicity in various charts for different perspectives. They are also depicted within time series plots to investigate trends. Finally, filter options are available to assess scores at varying time periods and within the context of other collected variable information.

Recruitment is continuous (weekly) but the results from several weeks may be combined into a monthly estimate for more precise estimates, which is the recommended reporting level. Weekly estimates are unweighted, but allow analysts to review scores more quickly and within smaller time intervals. Weekly estimates are less reliable for small domains, and should only be considered for aggregated populations. Monthly estimates will have larger sample sizes, and therefore higher reliability set to a 3% MOE at the 95% Confidence level (at the Modality Level for Veterans 18+). Monthly estimates are also weighted for improved representation and less bias (non-response and coverage, see section G on Sample Weighting). Quarterly estimates are the most precise, but will take the greatest amount of time to obtain (12 weeks of collection). However, Quarterly estimates are the most suitable for the analysis of small populations (e.g. VAMC, Female Veterans 18-29, etc.).

  5. Quality Control

To prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during the population file creation. The quality control steps are as follows.

  1. Records will be reviewed for missing sampling and weighting variable data. When records with missing data are discovered, they will be either excluded from the population file or put into separate strata upon discussion with subject matter experts.

  2. Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the double sampling of the same veteran.

  3. Invalid emails will be removed.

The survey sample loading and administration processes will have quality control measures built into them.

  1. The extracted sample will be reviewed for representativeness. A secondary review will be applied to the final respondent sample.

  2. The survey load process will be rigorously tested prior to the induction of the TH Survey to ensure that sampled customers are not inadvertently dropped or sent multiple emails.

  3. The email delivery process is monitored to ensure that bounce-back records will not hold up the email delivery process.

The weighting and data management quality control checks are as follows:

  1. The sum of the weighted respondents will be compared to the overall population count to confirm that the records are being properly weighted. When the sum does not match the population count, weighting classes will be collapsed to correct this issue.

  2. The unequal weighting effect will be used to identify potential issues in the weighting process. Large unequal weighting effects indicate a problem with the weighting classes, such as a record receiving a large weight to compensate for nonresponse or coverage bias.

  6. Sample Weighting, Coverage Bias, and Non-Response Bias

A final respondent sample should closely resemble the true population in terms of demographic distributions (e.g., age groups). One problem that arises in the survey collection process is nonresponse, defined as the failure of selected persons in the sample to provide responses. This occurs in varying degrees in all surveys, but the resulting estimates can be distorted when some groups are more or less prone to complete the survey. In many applications, younger people are less likely to participate than older persons. Another problem is under-coverage, in which certain groups of interest in the population are not included in the sampling frame. They cannot participate because they cannot be contacted: those without an email address will be excluded from the sampling frame. These two phenomena may cause some groups to be over- or under-represented. In such cases, when the respondent population does not match the true population, conclusions drawn from the survey data may not be reliable, and are said to be biased.

Survey practitioners recommend the use of sample weighting to improve inference on the population. Weighting will be introduced into the survey process as a tool that helps the respondent sample more closely represent the overall population. Weighting adjustments are commonly applied in surveys to correct for nonresponse bias and coverage bias. Because a business rule will be implemented requiring callers to provide an email address, the coverage bias for Survey 2 is expected to be small. In many surveys, however, differential response rates may be observed across age groups. In the event that some age groups are over-represented in the final respondent sample, the weighting application will yield somewhat smaller weights for those groups. Conversely, age groups that are under-represented will receive larger weights. This adjustment is termed non-response bias correction for a single variable. Strictly speaking, we can never know how non-respondents would have answered, but the adjustment calibrates the sample to resemble the full population from the perspective of demographics. This may result in a substantial correction in the resulting weighted survey estimates, compared to direct estimates, in the presence of non-negligible non-response bias.

Because the email population will have different demographics than the overall population, the initial sample will be selected from the frame in a manner such that the final respondent sample resembles the overall population. Stratification may also adjust for non-response (occurring when certain subpopulations are less prone to participate). Targets will be established for every permutation of the following stratification variables, and population values will be collected and recorded by VEO for every data collection period.

Stratification Variables

  • Modality (CVT, HT, SFT)

  • Subtype of Survey Modality

  • Gender (M, F)

  • Age Group (18-39, 40-59, 60+)

  • District



The stratification scheme above will result in a representative sample (with respect to the full population). Weighting will then be applied so that the sample is more fully matched to the population. Sample weights will be generated for Monthly and Quarterly estimates.

It was reported earlier that the email population comprises 30% of the full Telehealth population. Since 85% of older Americans utilize email (Choi & Dinitto, 2013), we can presume that most Veterans choose not to share their email address with VHA or are simply unaware of that option. It is assumed that the level of patient satisfaction is not directly related to email status (Missing at Random). Since age and gender have been observed to be strong predictors of patient satisfaction in other VA health surveys, the stratification and weighting methodology outlined above should adequately compensate for any bias introduced by the incomplete population frame.

Raking, or Iterative Proportional Fitting (IPF), will be used as the sample weighting method. IPF is a mathematical scaling method that adjusts the sample data so that its marginal (row and column) totals agree with constraining marginal totals from the population data (Battaglia et al., 2009; Kalton & Flores-Cervantes, 2003; Kolenikov, 2014). The response probabilities in IPF therefore depend on the row and column, and not on the particular cell (Lohr, 1999). IPF is an iterative process of recalculating the weights so that the original sample values are gradually adjusted, through repeated calculations, to fit the marginal constraints from the population. The starting weights correspond to the inverse of the probability of selection multiplied by a non-response adjustment factor. The final sample data is a joint probability distribution of maximum likelihood estimates, obtained when the probabilities converge within an acceptable (pre-defined) level of tolerance. The completion of IPF depends on the convergence of the algorithm, which requires that the cell estimates are not zero.
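A minimal two-dimensional raking sketch consistent with the description above (pure Python, illustrative margins only; an actual implementation would rake over Modality/Stage, Gender, Age Group, and District):

```python
def rake(cells, row_targets, col_targets, tol=1e-6, max_iter=100):
    """Iterative proportional fitting on a 2-D table of weighted counts.

    Alternately scales rows and columns so the table's margins match the
    population targets, stopping when cell changes fall below tol.
    """
    x = [row[:] for row in cells]  # work on a copy of the starting weights
    for _ in range(max_iter):
        max_change = 0.0
        for i, r in enumerate(row_targets):       # row adjustment step
            rsum = sum(x[i])
            for j in range(len(col_targets)):
                new = x[i][j] * r / rsum
                max_change = max(max_change, abs(new - x[i][j]))
                x[i][j] = new
        for j, c in enumerate(col_targets):       # column adjustment step
            csum = sum(x[i][j] for i in range(len(row_targets)))
            for i in range(len(row_targets)):
                new = x[i][j] * c / csum
                max_change = max(max_change, abs(new - x[i][j]))
                x[i][j] = new
        if max_change < tol:
            break
    return x

# Respondent counts by (age group x gender), raked to population margins
w = rake([[10, 20], [30, 40]], row_targets=[60, 40], col_targets=[55, 45])
print([round(sum(row), 3) for row in w])  # row sums match [60.0, 40.0]
```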

The mathematical definitions of IPF are indicated below (Wong, 1992):

  x_ij^(k) = x_ij^(k-1) × r_i / Σ_j x_ij^(k-1)   (1)

  x_ij^(k+1) = x_ij^(k) × c_j / Σ_i x_ij^(k)     (2)

where

  • x_ij^(k) is the matrix element in row i, column j, and iteration k.

  • r_i and c_j are the pre-defined row totals and column totals, respectively.



Equations (1) and (2) are employed iteratively to estimate new cell values and will theoretically stop at iteration m, where

  | x_ij^(m) - x_ij^(m-1) | < ε  for all i and j,

with ε a pre-defined convergence tolerance.

As part of the weighting validation process, the weights of persons in age and gender groups are summed and verified to match the universe estimates (i.e., population totals). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992; Liu et al., 2002). This statistic is an indication of the amount of variation that may be expected due to the inclusion of weighting. The unequal weighting effect estimates the percent increase in the variance of the final estimate due to the presence of weights, and is calculated as:

  UWE = 1 + cv^2

where

  • cv = coefficient of variation of the weights, cv = s / w̄.

  • s = sample standard deviation of the weights.

  • w̄ = sample mean of the weights w_ij.
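The UWE calculation can be illustrated with a short sketch (the function name is ours): equal weights give a UWE of 1 (no variance inflation), while dispersed weights inflate it.

```python
def uwe(weights):
    """Unequal weighting effect: 1 + cv^2, where cv is the coefficient
    of variation of the final weights (Kish, 1992)."""
    mean = sum(weights) / len(weights)
    var = sum((w - mean) ** 2 for w in weights) / (len(weights) - 1)
    return 1 + var / mean**2

print(round(uwe([1.0, 1.0, 1.0, 1.0]), 2))  # 1.0 -> no variance increase
print(round(uwe([0.5, 1.0, 1.5, 2.0]), 2))  # 1.27 -> ~27% variance increase
```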

  7. Quarantine Rules

VEO seeks to limit contact with Veterans as much as possible, and only as needed to achieve measurement goals. The rules below are enacted to prevent excessive recruitment attempts upon Telehealth patients. VEO also monitors Veteran participation within other surveys, to ensure Veterans do not experience survey fatigue. Finally, all VEO surveys offer options for respondents to opt out, ensuring they are no longer contacted for a specific survey.

Table 4. Quarantine Protocol

Quarantine Rule                         | Description                                                                                                                          | Elapsed Time
----------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------|-------------
Repeated Sampling for Telehealth Survey | Number of days between receiving/completing the online survey and receiving an email invitation for a separate Telehealth experience | 60 Days
Other VEO Surveys                       | Number of days between receiving/completing the online survey and becoming eligible for another VEO survey                            | 30 Days
Prioritization                          | Prioritization is based on the observed sample sizes.                                                                                 | N/A
Opt Outs                                | Persons indicating their wish to opt out of either phone or online surveys will no longer be contacted.                               | N/A
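A hedged sketch of how the 60-day rule in Table 4 might be enforced when building an invitation file (the contact-history structure and identifiers are hypothetical, not actual VEO data structures):

```python
from datetime import date, timedelta

def eligible(candidates, last_contacted, today, quarantine_days=60):
    """Drop Veterans contacted for this survey within the quarantine window."""
    cutoff = today - timedelta(days=quarantine_days)
    return [v for v in candidates if last_contacted.get(v, date.min) <= cutoff]

# Illustrative contact history: vet-001 was invited 12 days ago
history = {"vet-001": date(2018, 6, 20), "vet-002": date(2018, 3, 1)}
print(eligible(["vet-001", "vet-002", "vet-003"], history, today=date(2018, 7, 2)))
# -> ['vet-002', 'vet-003']  (vet-001 is still quarantined)
```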



Part III – Assumptions and Limitations

A. Repeated Surveys

There are existing Telehealth surveys that are administered electronically to patients through means other than email. This double-surveying may reduce the expected 20% response rate for the VEO Telehealth Survey, due to survey fatigue among Telehealth patients. The response rate will be monitored, and the number of patients contacted may be increased to maintain the final respondent targets.

B. Coverage Bias

Since the VEO Telehealth Survey is email only, there is a large population of VHA patients that cannot be reached by the survey. Veterans who lack access to the internet or do not use email may have different levels of trust and satisfaction with their service. However, the majority of Veterans without an email address on file either did not have an opportunity to provide the information or elected not to share it. As such, it is thought that Veterans in this latter category do not harbor any tangible differences from Veterans who do share their information. To verify this, VEO plans to execute a coverage bias study to assess the amount of coverage bias due to frame under-coverage, and to derive adjustment factors in the presence of non-negligible bias.

C. Other Issues

The telehealth service may be of limited use for the diagnosis and treatment of some illnesses and conditions. Veterans who have complex diseases, such as cancer, may choose not to use telehealth to pursue medical care even if they are located in remote areas. Telehealth users therefore do not cover Veterans with the full spectrum of diseases, and the types of Veterans responding should be taken into consideration when interpreting the survey results and their applications.

Rating the telehealth service may require Veterans to be familiar with, and have access to, modern technologies (e.g., mobile apps and online video chat). Therefore, Veterans who use telehealth services and respond to the survey may skew younger. The demographic distribution of the survey respondents will be reviewed by VEO when receiving the survey results.

Home Telehealth is designed to provide medical care and services to high-risk Veterans with chronic disease. When such patients receive the survey, their family members, caregivers, or nurses may respond on their behalf. Therefore, feedback from the primary source may be missing. VEO will continue to identify these responses in the VA databases and assess their effect on the Telehealth Survey estimates.

Part IV - Appendices

Appendix 1. References

Choi, N.G. & Dinitto, D.M. (2013). Internet Use Among Older Adults: Association with Health Needs, Psychological Capital, and Social Capital. Journal of Medical Internet Research, 15(5), e97

Kalton, G., & Flores-Cervantes, I. (2003). Weighting Methods. Journal of Official Statistics, 19(2), 81-97.

Kish, L. (1992). Weighting for unequal P. Journal of Official Statistics, 8(2), 183-200.

Kolenikov, S. (2014). Calibrating Survey Data Using Iterative Proportional Fitting (Raking). The Stata Journal, 14(1): 22–59.

Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.

Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified sampling. Proceedings of the American Statistical Association’s Section on Survey Research Methods.

Wong, D.W.S. (1992). The Reliability of Using the Iterative Proportional Fitting Procedure. The Professional Geographer, 44(3), 340-348.

1 VA Telehealth Services Fact Sheet FY17, Office of Connected Care, VHA, VA
