Youth Advice and Feedback to Inform Choose Respect Implementation

OMB: 0920-0816

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


The proposed research will not employ statistical methods to sample respondents for either the focus groups or the online surveys. This section justifies the decision not to use statistical sampling and analysis for the data collection methods that the project will employ.


The project’s proposed data collection procedures are described below.


The methods are:

  • Focus groups composed of 12 or fewer respondents per group.

  • Online surveys of 200 respondents per survey.



The anticipated data collections will be small in scale because they are intended to inform an iterative process of developing a health communication campaign, not to be generalized to a specified respondent universe. These audience-specific methods rely not on statistical power, but on the theoretical premise that language is interpreted through shared cultural knowledge and frameworks (Glaser and Strauss, 1967). To increase the likelihood that a message will be noticed, to avoid miscommunication, and to guard against insensitivity in specialized communication to sub-cultural groups, the proposed data-gathering techniques provide “…a ‘window’ on a particular worldview” (Priest, 1996).



By incorporating qualitative and quantitative elements in various mixtures, these methods provide flexibility for in-depth probing, are feasible in time-sensitive situations, and have worked well in campaign development for commercial advertising (Glaser and Strauss, 1967).


B. 1. Respondent Universe and Sampling Methods


The primary target audience for the Choose Respect campaign is youth ages 11 to 14.


For the in-person focus groups, the sample will be drawn from youth in or near three target markets selected based on geographic diversity and access to the target audience. The markets have not yet been determined, but for budgeting purposes we have assumed that the focus groups will be held in Atlanta, Georgia; Dallas, Texas; and San Francisco, California. These three markets were tentatively selected because they are geographically dispersed and have been used in prior Choose Respect research. Participants for the in-person focus groups will be recruited from registries of potential adult participants that are owned and maintained by the focus group facilities. These potential participants have indicated to the facility that they are interested in being offered opportunities to participate in focus groups. Trained staff at the focus group facilities will contact adults in their databases to determine whether they have children in the appropriate age ranges who can participate in the groups. All groups will be recruited for a mix of family income and ethnicity. They will be segmented by gender, age (e.g., 11- to 12-year-olds, 12- to 13-year-olds, and 13- to 14-year-olds), and residential setting (e.g., urban, suburban). Recruitment for each focus group will continue until the targeted number of participants has been achieved.


For the online survey research, the sample will be drawn from youth whose parents are members of Harris Interactive’s existing database of over 6 million adults (the Harris Poll Online panel) who have expressed interest in participating in online research. Harris Interactive rigorously recruits for and maintains this database so that its participants’ demographic characteristics are comparable to those of the U.S. population. One hundred percent of the database participants have confirmed through a two-step process that they want to be part of the database and to be offered opportunities to participate in online research (Harris, 2008).


For both the in-person focus groups and online surveys, the project will use convenience sampling to select participants, and all youth will be recruited through their parents. Because we will not be using a statistical method for sampling the respondents, participants and their responses to study questions will not necessarily be representative of the full universe of youth ages 11 to 14. However, because these data will be collected for program improvement purposes only, a non-probability sampling approach is appropriate for our needs.


Based on past experience conducting online research, we expect that the response rate for the online surveys will be somewhere between 5 percent and 50 percent. Because we will not be using a statistical method to sample and recruit participants, this relatively low response rate will not affect the accuracy or usefulness of the data obtained. An examination of response rates for the in-person focus group research is not relevant given that, as a research method, focus groups are not intended to function as a representative sample of a larger population, but rather to provide insights about the range of ways the target audience perceives a situation or tactic (Krueger, 1988; Stewart and Shamdasani, 1990). That said, based on past experience, we expect at least 10 of the 12 participants we recruit for each focus group to show up for the group.


B. 2. Procedures for the Collection of Information


Focus Groups


For the in-person focus groups, Ogilvy will work with professional facilities to recruit participants using conventional recruitment methods and a “screener instrument” (Stewart and Shamdasani, 1990; NCI, 2002) (See Attachment M, “Focus Group Screening Instrument for Parents and Youth.”). All participants will be recruited through their parents to allow the project to collect parental permission in addition to youth assent. The facilities will identify adults who may be parents of potential youth participants through their existing databases, which they own and maintain.  Adults in these databases have indicated to the facility that they are interested in being offered opportunities to participate in focus groups.


A screener instrument is a questionnaire designed for recruiters to use to identify qualifying participants during a brief telephone conversation or, in the case of online surveys, through completion of a brief online questionnaire. “Screeners” are carefully structured so that the questioning process is short, easy to understand, friendly, and efficient. (See Attachment M.) Screeners for the in-person groups will be administered to parents of potential participants during short telephone calls, in which recruiters will explain that youth participants will be compensated for their participation in the focus group. As described in Section A.9, incentives will vary slightly across groups, based on local cost-of-living differences. The amount of compensation will be roughly $75 to $100 per person.


The recruiters will administer a short screener to the parent over the telephone to determine whether they have any children of the appropriate age and background to participate in a focus group. If the child meets the criteria for participating in a focus group (in terms of gender, age, etc.), the recruiter will briefly explain the purpose of the project and request the parent’s verbal permission to speak with their child over the telephone. The recruiter will then administer a short screener to the child to confirm his or her eligibility and ability to express himself or herself in a focus group discussion. The recruiter will then invite the child to participate in a focus group discussion and provide the parent with information about the date and location of the group, as well as other logistical details.


Upon successful recruitment, the parent or guardian of each participating youth will receive a confirmation letter from the facility. The letter will contain the logistical information (e.g., address and directions to the focus group facility, reminder about the date and time) and an informed permission form for the parent (See Attachment K, “Focus Group Parental Permission Form.”). Each parent or guardian will be required to return the signed permission form prior to their youth’s participation in the focus group. Upon arrival at the focus group facility, each youth also will be read an assent form and asked to sign it prior to the start of the focus group (See Attachment L, “Focus Group Youth Assent Form.”).


After assenting to participate in the study, the respondents will be asked to complete a brief written survey while they wait in the waiting room for the focus group to start (See Attachment N, “Focus Group Survey.”). Once all surveys have been completed, the respondents in each group will meet in a room with a trained moderator and a one-way mirror, behind which CDC and Ogilvy staff will sit. The moderator will explain the study, inform the group of taping and observation, and lead a discussion using the moderator’s guide (See Attachment D, “Focus Group Moderator’s Guide.”). Responses will be collected by audiotape, and the observers will take notes. Following each focus group, the tapes will be transcribed for qualitative analysis by Ogilvy staff. The tapes and observer notes will be destroyed once the final focus group report has been submitted to CDC.

Online Surveys


The youth online survey respondents also will be recruited through their parents, allowing the project to collect parental permission in addition to youth assent and to protect the privacy of the youth participants.


Harris Interactive has an existing database of adults (the Harris Poll Online panel) who have expressed an interest in participating in online research. Harris maintains basic demographic information about the members of the panel, including the presence and ages of children in the household. Harris will select a random sample of adults who have children in the household. Each parent will receive an email from Harris Interactive (see Appendix E) explaining the general topic of the survey and containing a password-protected link to a secure Web site for the survey. The password-protected link will be uniquely assigned to the parent’s email address to ensure that each respondent completes the survey only once.


After clicking on the link, parents will be directed to the parent screener (Appendix F), which will be used to determine whether the adults have children living in their households who qualify for the study. Parents also will be provided with information about the purpose of the survey and an opportunity to either provide or decline parental permission for their children to participate.

If a parent gives permission for their child to participate, and if the child is determined to be eligible based on the parent’s responses to the screener questions, the parent will either be directed to bring their child to the computer at that time to complete the youth screener and the survey, or be given instructions on how to have their child resume at a later time.


The child will then complete a short screener requesting their grade, age, and gender to confirm their eligibility and that they are the child for whom the parent provided permission. The child then will be provided with a brief description of the project and asked for their assent to participate. Once assent is obtained, the Web site will direct the youth to another page within the secure site to complete the survey. The child will not be able to return to the parent portion of the survey. Once youth have completed a survey, they will be able to see how their responses to selected questions compare with the aggregate of all responses to the survey. Please see Appendix H for the online youth screener and Appendix I for the youth assent script. Appendix C contains the sample survey.


B. 3. Methods to Maximize Response Rates and Deal with Nonresponse


To encourage participation, focus group meetings will be held in locations that are convenient and easily accessible by public transportation, and where parking also is safe and easy. The group discussions will be held in clean, safe, and comfortable environments. In addition, the letter that the parents will receive a few days following the initial telephone recruitment call will serve as a reminder about the focus group. Based on past experience, we expect at least 10 of the 12 youth recruited for each group to show up, providing at least an 80 percent response rate.


For the online surveys, we expect a response rate of between 5 percent and 50 percent, based on past experience administering similar surveys. Because these data will be collected for program improvement purposes only, and the project is not using a statistical method for selecting participants, this relatively low response rate will not affect the usefulness or the accuracy of the data collected. To improve the response rate, one reminder invitation will be emailed two days after the initial invitation to those parents whose children have not yet completed the survey.


B. 4. Tests of Procedures or Methods to be Undertaken


The online survey (See Attachment C, “Online Survey.”) and focus group moderator guide (See Attachment D, “Focus Group Moderator’s Guide.”) were developed using standard focus group discussion and online survey design procedures. The nature and framing of the questions are consistent with questions that have been posed successfully to youth audiences on behalf of other national health communication initiatives, including the CDC’s VERB campaign, a national, multicultural, social marketing initiative to increase and maintain physical activity among youth ages 9 to 13; and the Office of National Drug Control Policy’s National Youth Anti-Drug campaign, a national initiative to keep youth drug-free, which targets 9- to 18-year-olds. In addition, several of the online survey questions (questions 1 and 6-13) were used in the 1997 household survey of U.S. youth, YouthStyles. These questions had an average response rate of 96 percent, indicating that youth did not have difficulty understanding or answering them.


The moderator guides and survey questions have been thoroughly reviewed by CDC and Ogilvy staff, as well as by our research partners. In addition, the questions will be pretested internally, using no more than nine individuals to estimate the length of time it will take to complete the questions, as well as to identify any questions that are confusing or difficult to answer.


B. 5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


CDC Staff:


Marie Boyle, MS

770-488-2040

Mle4@cdc.gov


The persons who assisted with designing the data collection and who will analyze the data are:


Ogilvy Staff:


Jennifer Wayman, M.H.S.

202-729-4161

Jennifer.wayman@ogilvypr.com


Michael Briggs

202-729-4198

Michael.briggs@ogilvypr.com


Margo Gillman, M.P.H.

202-729-4192

Margo.gillman@ogilvypr.com


Nancy Accetta, M.H.S., CHES

202-729-4167

Nancy.accetta@ogilvypr.com


Jennifer Scott, Ph.D.

212-880-5260

Jennifer.scott@ogilvypr.com



Harris Interactive:


Annette Abell, M.B.A.

585-214-7386

aabell@harrisinteractive.com


Dana Markow, Ph.D.

212-212-9676

dmarkow@harrisinteractive.com


Independent Focus Group Moderators:


Pat Marzi, M.B.A.

610-683-7762

pmarzi@ptd.net


Sonya Schroeder

925-658-2212

sonyak123@yahoo.com


REFERENCES


Andreasen A. Marketing social change: changing behavior to promote health, social development, and the environment. San Francisco: Jossey-Bass; 1995.


Asbury LD, Wong FL, Price SM, Nolin MJ. 2008. The VERB™ campaign: applying a branding strategy in public health. American Journal of Preventive Medicine 34 (6): S183-S187.



Berlin, M., Mohadjer, L., Waksberg, J., Kolstad, A., Kirsch, I., Rock, D., & Yamamoto, K. (1992). An experiment in monetary incentives. In the American Statistical Association (ed.), Proceedings of the American Statistical Association Section on Survey Research Methods (pp. 393-398). Alexandria, VA: American Statistical Association.


Black, DR, Blue, CL, & Coster, DC. (2001) Using social marketing to develop and test tailored messages. American Journal of Health Behavior, 25(3): 260-271.


Black, MC, Noonan, R, Legg, M, Eaton, D, & Breiding, MJ. (2006). Physical dating violence among high school students--United States, 2003. MMWR Weekly, 55, 532-535.


Bowman, RL, & Morgan, HM. (1998). A comparison of rates of verbal and physical abuse on campus by gender and sexual orientation. College Student Journal, 32, 43-52.


Campbell, J. (2002). Health consequences of intimate partner violence. The Lancet, 359, 1331-1336.


Cano, A, Avery-Leaf, S, Cascardi, M, & O’Leary, DK. (1998). Dating violence in two high school samples: Discriminating variables. Journal of Primary Prevention, 18, 431-446.


Carlin, DB, & McKinney, MS. (Eds.). (1994). The 1992 presidential debates in focus. Westport, CT: Praeger.


Carver, K, Joyner, K, & Udry, JR. (2003). National estimates of adolescent romantic relationships. In P. Florsheim (Ed.), Adolescent romantic relations and sexual behavior: Theory, research, and practical implications. Mahwah, NJ: Erlbaum.


Centers for Disease Control and Prevention. 2003. Costs of intimate partner violence against women in the United States. Atlanta, GA. Available online at http://www.cdc.gov/ncipc/pub-res/ipv_cost/ipv.htm [Accessed on February 13, 2008].



Coreil J, Bryant CA, Henderson JN. 2000. Social and Behavioral Foundations of Public Health. Thousand Oaks, CA: Sage



Church, A.H. (1993). Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly, 57, 62-79.



Creswell, JW. (2003). Research Design. Qualitative, Quantitative, and Mixed Methods Approaches. Second Edition. Thousand Oaks: Sage Publications.



Creswell, JW, Plano Clark, VL, Gutmann, ML, & Hanson, WE. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp. 209-240). Thousand Oaks, CA: Sage.



Delli Carpini, MX, & Williams, BA. (1994). Methods, metaphors, and media research: The uses of television in political conversation. Communication Research, 21, 782-812.



Feiring, C. (1996). Lovers as friends: Developing conscious views of romance in adolescence. Journal of Research on Adolescence, 7, 214-224.



Findlay, J.S., & Shaible, W. L. (1980). A Study of the Effect of Increased Remuneration on Response in a Health and Nutrition Examination Survey. In the American Statistical Association (ed.), Proceedings of the American Statistical Association Section on Survey Research Method. (pp. 590-594). Washington, D.C.: American Statistical Association.



Foshee VA, Bauman KE, Ennett ST, Suchindran C, Benefield T, Linder FG. 2005. Assessing the effects of the dating violence prevention program “Safe Dates” using random coefficient regression modeling. Prevention Science 6 (3): 245–58.


Foshee, V, et al. 1996. Gender differences in adolescent dating abuse: prevalence, types and injuries. Health Education Research: Theory & Practice 11(3): 275-286.


Glaser, BG, & Strauss, AL. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Publishing Company.



Grier, S, & Bryant, CA. (2005). Social marketing in public health. Annual Review of Public Health, 26, 319-339.



Harris Interactive. (2008). What To Look for When Considering Online Research: The Fundamentals. New York: Harris Interactive.



Heckathorn, DD. (2002). "Respondent-Driven Sampling II: Deriving Valid Estimates from Chain-Referral Samples of Hidden Populations". Social Problems 49: 11-34.



Heckathorn, DD. (1997). "Respondent-Driven Sampling: A New Approach to the Study of Hidden Populations". Social Problems 44: 174-199.



Kotler P, Roberto N, Lee N. 2002. Social marketing: improving the quality of life. Thousand Oaks (CA): Sage.



Krueger, RA. (1988). Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage Publications.


Krueger, RA. (1994). Focus Groups: A Practical Guide for Applied Research. 2nd ed. Thousand Oaks, CA: Sage Publications.


Kulka, R. A. (1994) The Use of Incentives to Survey “Hard-to-Reach” Respondents: A Brief Review of Empirical Research and Current Practice. Paper prepared for the Council of Professional Associations on Federal Statistics’ Seminar on New Directions in Statistical Methodology. Bethesda, MD.



Lenhart A, Madden M, Hitlin P. 2005. Teens and Technology: Youth Are Leading the Transition to a Fully Wired and Mobile Nation. Washington. Pew Internet & American Life Project. Available online at http://www.pewinternet.org/pdfs/PIP_Teens_Tech_July2005web.pdf [Accessed on February 25, 2008].



Lenhart A, Madden M, Macgill AR, Smith A. 2007. Teens and Social Media: The use of social media gains a greater foothold in teen life as they embrace the conversational nature of interactive online media. Washington. Pew Internet & American Life Project. Available online at http://www.pewinternet.org/pdfs/PIP_Teens_Social_Media_Final.pdf [Accessed on April 29, 2008].


Lenhart A, Madden M. 2007. Social Networking Websites and Teens: An Overview. Washington. Pew Internet & American Life Project. Available online at http://www.pewinternet.org/pdfs/PIP_SNS_Data_Memo_Jan_2007.pdf [Accessed on February 25, 2008].


Magdol, L, Moffitt, TE, Caspi, A, & Silva, P. (1998). Developmental antecedents of partner abuse: A prospective-longitudinal study. Journal of Abnormal Psychology, 107, 375-389.


Morgan, DL. (1988). Focus Groups As Qualitative Research. Newbury Park: Sage Publications.


Olsen S. 2007. Kids say email is, like, soooo dead. CNET.com. Available online at http://www.news.com/Kids-say-email-is,-like,-soooo-dead/2009-1032_3-6197242.html?tag=nefd.lede [Accessed May 1, 2008].

O’Keefe, M. 2005. Teen dating violence: a review of risk factors and prevention efforts. National Resource Center on Domestic Violence (2).


O’Leary, KD, & Slep, AS. (2003). A dyadic longitudinal model of adolescent dating aggression. Journal of Clinical Child and Adolescent Psychology, 32, 314-327.


Priest, SH. (1996). Doing Media Research: An Introduction. Thousand Oaks, CA: Sage Press.



Rennison, CM, Welchans, S. (2000). Intimate partner violence. U.S. Department of Justice, Bureau of Justice Statistics Special Report. Retrieved September 16, 2006 from http://www.ojp.usdoj.gov/bjs/pub/pdf/ipv.pdf



Salganik, MJ & Heckathorn, DD. (2004). "Sampling and Estimation in Hidden Populations Using Respondent-Driven Sampling". Sociological Methodology 34: 193-239.



Silverman, JG, Raj, A, Mucci, L, Hathaway, J. 2001. Dating violence against adolescent girls and associated substance use, unhealthy weight control, sexual risk behavior, pregnancy, and suicidality. Journal of the American Medical Association 286 (5): 572–9.

Silverman, JG & Williamson, GM. (1997) Social Ecology and Entitlements Involved in Battering by Heterosexual College Males: Contributions of Family and Peers. Violence and Victims, 12(2): 147-164.



Singer, E., Gebler, N., Raghunathan, T., VanHoewyk, J., & McGonagle, K. (in press). The Effect of Incentives on Response Rates in Face-to-Face, Telephone, and Mixed Mode Surveys. Journal of Official Statistics.



Smith, PH, White, JW, Holland, LJ. 2003. A longitudinal perspective on dating violence among adolescent and college-age women. American Journal of Public Health 93 (7): 1104–9.


Stewart, DW, Shamdasani, PN. (1990). Focus Groups: Theory and Practice. Volume 20. Newbury Park: Sage Publications.



Sugarman, DB & Hotaling, GT. (1989) Dating violence: Prevalence, context, and risk markers. In M. Pirog-Good and J. Stets (Eds.), Violence in dating relationships: Emerging social issues (pp. 3-32). New York: Praeger.



Plichta, SB. Violence and abuse: implications for women’s health. In: Falik MM, Collins KS, editors. Women’s health: the commonwealth survey. Baltimore (MD): Johns Hopkins University Press; 1996.


Taylor, H. (2007). The case for publishing (some) online polls. The Polling Report, 23, 1.



Teen Research Unlimited (TRU). 2008. The TRU Study 2008: U.S. Teen Edition.



U.S. Department of Health and Human Services, National Institutes of Health, National Cancer Institute. 1989. Making Health Communication Programs Work. Washington (DC). Available online at http://www.cancer.gov/pinkbook/page1 [Accessed on July 28, 2008].


U.S. Department of Labor, Bureau of Labor Statistics. 2008. Employment, hours, and earnings from the Current Employment Statistics survey (national). Washington, DC. Available online at http://data.bls.gov/PDQ/servlet/SurveyOutputServlet?request_action=wh&graph_name=CE_cesbref3 [Accessed on May 7, 2008].



Wolfe, DA, Wekerle, C, Scott, K. 1997. Alternatives to violence: empowering youth to develop healthy relationships. Thousand Oaks (CA): Sage.



YPulse. July 14, 2006. Ten Biggest Themes of ‘What Teens Want.’ Available online at http://ypulse.com/archives/2006/07/the_ten_biggest.php [Accessed on May 1, 2008].







