Military Child Development Program Workforce Survey and Case Studies

SUPPORTING STATEMENT – PART B

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS - 0704-MCDS

1. Description of the Activity

The Office of the Deputy Assistant Secretary of Defense, Military Community & Family Policy (MC&FP) partnered with RAND’s National Defense Research Institute (NDRI) to identify and analyze the factors contributing to the staffing issues that appear to drive long wait lists for child care, and to offer solutions. The overall goal of the project is to collect information that will allow the Department to evaluate, and then make informed decisions on ways to improve, the strategies currently in use to recruit, train, and retain qualified staff within the Child Development Program (CDP).

The project consists of two components. First, RAND will conduct the Military Child Development Program Workforce Survey, an online survey of the CDP workforce, specifically those employed at child development centers (CDCs) serving children from birth to kindergarten entry and school-age care (SAC) available to children up through age 12. In the second component, RAND will conduct in-depth case studies at a limited number of military installations to learn more about CDP staffing issues, in the context of the local child care market, that cannot be readily addressed through the workforce survey. Where appropriate, we discuss each study component separately in responding to the questions below.

Survey Component

The workforce survey component seeks to address the following questions:

  • What are the characteristics of the CDP workforce in terms of their demographics, education and training background, work experience, veteran status, marital status, and spouse military status?

  • What compensation do CDP workforce members receive in terms of pay and other benefits, and what working conditions do they face?

  • What experiences did CDP staff have at the time of recruitment, and what factors attracted CDP workforce members to work in child care and early learning settings?

  • What are the opportunities available to CDP workforce members for professional development and career advancement?

  • How satisfied are CDP workforce members in their jobs? What are their future plans for staying in their position or with the CDP more generally, and what factors affect those plans?

  • What strategies could be used to attract qualified workers to the CDP, support their future career trajectory and professional development, and retain them in the field?

The target population for the online survey is the approximately 22,300 CDP staff who were employed in a CDC or SAC program across the Air Force, Army, Marine Corps, Navy, and Space Force as of 2022. This target population includes direct-care classroom staff (i.e., lead teachers and assistant teachers), center managers (e.g., directors and assistant directors), other professional staff (e.g., special education teachers, training and curriculum [T&C] specialists, behavioral specialists, nurses) and other support staff (e.g., administrative assistants, kitchen staff, and custodial and maintenance staff).1 Targeted staff will include those in both appropriated funds (APF) and non-appropriated funds (NAF) positions and those who work full-time, part-time, or as flex employees.

Table 1 shows the approximate number of staff in each of these categories and the expected response rates based on other similar surveys.

Table 1. Expected Number of Survey Participants

CDP Workforce Position                    | Total Number (2022) | Expected Response Rate | Expected Number of Respondents
------------------------------------------|---------------------|------------------------|-------------------------------
Direct Care Staff                         | 17,680              | 25%                    | 4,420
Management and Other Professional Staff   | 1,151               | 25%                    | 288
Other Support Staff                       | 3,494               | 15%                    | 524
Total                                     | 22,325              |                        | 5,232
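The expected respondent counts in Table 1 follow directly from the 2022 staff totals and the assumed response rates; a minimal sketch of the arithmetic, using only the figures from the table:

```python
# Reproduce the Table 1 arithmetic: expected respondents are the 2022
# staff totals multiplied by the assumed response rate for each group.
staff = {
    "Direct Care Staff": (17_680, 0.25),
    "Management and Other Professional Staff": (1_151, 0.25),
    "Other Support Staff": (3_494, 0.15),
}

# Round each product to the nearest whole respondent, as in the table.
expected = {group: round(total * rate) for group, (total, rate) in staff.items()}

total_staff = sum(total for total, _ in staff.values())      # 22,325
total_expected = sum(expected.values())                      # 5,232
```

The table's overall total (5,232) is the sum of the rounded group-level figures rather than a single rate applied to the full population.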





Case Study Component

RAND will conduct case studies of the CDP at up to six installations. Each case study will consist of semi-structured interviews with a variety of CDP personnel and community stakeholders at an installation and its surrounding community, as well as a review of documents and workforce data. Interviews will be conducted during an in-person site visit to the installation and may also be conducted online and outside the timeframe for the in-person visit if participants prefer.

The case studies will collect information that helps the Department evaluate and improve its current strategies to recruit, train, and retain CDP staff. They will capture in-depth information on a range of topics not covered by any other data source. Specifically, the case studies will answer the following policy questions:

  1. What factors do CDP leaders, direct care professionals (including direct care staff and Family Child Care (FCC) Program providers2), support staff, and representatives of community-based child care organizations perceive as affecting the military services’ ability to recruit and retain CDP staff and FCC providers (e.g., pay and benefits, training requirements, career development opportunities, or working conditions)?3

  2. What factors do the above stakeholders perceive as affecting child care access, quality, and other important outcomes for the CDP?

  3. How do the above stakeholders evaluate proposed solutions to improve recruitment and retention of CDP staff, and what kinds of solutions do they envision?

Selection of Installations

RAND will use a non-probability purposive sampling method to select up to six installations from among all US military installations within and outside the continental US. In a purposive sample, researchers select sample members to represent specific characteristics in a population. For our case studies, we will select installations to achieve diversity in terms of geography (i.e., regions of the US), military services, and conditions that may represent challenges for delivering accessible and high-quality child care (e.g., relatively long waitlists, geographic isolation, or high local wages for occupations with similar entry requirements to child care). The source of these data will be a table of installations provided by our DoD sponsor, as well as publicly available information on installations, their surrounding communities, and their child care policy environments. Because we can conduct case studies at only a small number of installations, and because we need installations with diversity on key characteristics for the purposes of the study, purposive sampling is an appropriate method for sampling installations.
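Purposive selection on multiple characteristics can be sketched as a simple greedy coverage procedure: at each step, pick the installation that contributes the most not-yet-represented characteristics. The installation names, attribute categories, and selection rule below are purely illustrative assumptions, not the study's actual method:

```python
# Hypothetical sketch of purposive sampling for diversity: each candidate
# installation has a region, a military service, and a child care challenge
# condition; we greedily select sites that cover the most new traits.
installations = [
    {"name": "Site A", "region": "South", "service": "Army", "challenge": "long waitlist"},
    {"name": "Site B", "region": "West", "service": "Navy", "challenge": "high local wages"},
    {"name": "Site C", "region": "South", "service": "Air Force", "challenge": "geographic isolation"},
    {"name": "Site D", "region": "Northeast", "service": "Marine Corps", "challenge": "long waitlist"},
]

def purposive_sample(candidates, k):
    """Select up to k sites, each step adding the site with the most uncovered traits."""
    selected, covered = [], set()
    pool = list(candidates)
    while pool and len(selected) < k:
        best = max(pool, key=lambda s: len({s["region"], s["service"], s["challenge"]} - covered))
        selected.append(best)
        covered |= {best["region"], best["service"], best["challenge"]}
        pool.remove(best)
    return selected

sample = purposive_sample(installations, k=3)
```

In practice the selection would be informed by the sponsor-provided table of installations and public data, with researcher judgment rather than a mechanical rule; the sketch only illustrates the coverage logic.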

After selecting the sample, we will work with our DoD sponsor to obtain an installation-based point of contact to facilitate planning for each case study site, as well as contact information for CDP leaders at each installation. We expect that some of the installations we select may be unable to participate. However, we have no basis for estimating how many, because this type of data collection has not been previously conducted. We will identify a set of six to 12 backup installations to replace sampled installations that are unable to participate during the data collection timeframe.

Selection of Interview Participants

The broad categories of individuals whom we will interview for each case study include:

  • Installation and CDP leaders and subject matter experts

  • Direct care professionals, including direct care staff and family child care (FCC) providers4

  • Support staff

  • Representatives of community-based child care organizations, such as Child Care Aware

Table 2 shows the total number of participants in each category that we expect to interview. No centralized list of these individuals exists. Thus, it would be infeasible for RAND to report the total number of individuals in the potential respondent universe or sample from this population using probability sampling. Instead, RAND will use a combination of convenience and purposive sampling methods to sample from the population.

Table 2. Expected Number of Case Study Participants

Type of Case Study Participant                          | Participants Per Site | Total Participants
--------------------------------------------------------|-----------------------|-------------------
Installation and CDP leaders and subject matter experts | 5                     | 30
Direct care professionals                               | 15                    | 90
Support staff                                           | 6                     | 36
Community stakeholders                                  | 1                     | 6
Total                                                   | 27                    | 162
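The totals in Table 2 are simply the per-site targets applied at each of the six installations; a quick check of the arithmetic:

```python
# Table 2 arithmetic: per-site interview targets apply at each of the
# six case-study installations, so totals are per-site counts times six.
NUM_SITES = 6

per_site = {
    "Installation and CDP leaders and subject matter experts": 5,
    "Direct care professionals": 15,
    "Support staff": 6,
    "Community stakeholders": 1,
}

totals = {category: n * NUM_SITES for category, n in per_site.items()}

participants_per_site = sum(per_site.values())  # 27
total_participants = sum(totals.values())       # 162
```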

Installation and CDP Leaders and Subject Matter Experts

Approximately three months before each site visit, RAND will contact the installation’s CDP branch chief, flight chief, or equivalent and schedule an orienting interview to introduce them to the project and help us plan data collection. We will obtain names and contact information for a CDC director or assistant director, a SAC director or assistant director, and a local human resources (HR) subject matter expert, and use this information to invite leaders and subject matter experts to participate in interviews.

Direct Care Professionals and Support Staff

To accommodate the schedules of direct care staff and support staff employed by CDCs and SACs, and to maximize the number of staff from whom we can collect data, we will use two data collection tracks in parallel: an onsite track and a remote track. Prior to each site visit, we will ask CDP leaders to provide installation CDC and SAC direct care and support staff with recruitment materials that describe the interviews, provide our contact information, and invite them to sign up for an interview. Staff will be provided with a web-based form where they can volunteer for an interview time slot and interview mode (i.e., in-person or virtual) that aligns with their preferences. Responses to the web-based form will be collected through Qualtrics, a FedRAMP-certified web-based survey platform. RAND staff will collect contact information (e.g., names, email addresses, and phone numbers) and employment information (e.g., position, child care setting, and years spent working in their position and in the military child development program) to support recruitment and scheduling of interviews. We will follow a similar strategy for recruiting FCC providers for interviews.

Community Stakeholders

We will coordinate with our DoD sponsor to identify and obtain contact information for relevant community-based organizations. In addition, we may identify community stakeholders using snowball sampling (i.e., asking participants in initial interviews to recommend additional community-based organizations to interview). Prior to each site visit, we will contact these individuals and request an interview.

We plan to interview a target number of individuals from each of the above categories at each installation. We expect that not all individuals who receive our recruitment outreach will choose to participate. However, we have no basis for reporting an estimated response rate since this type of data collection has not been previously conducted. We will continue to recruit and interview individuals until we have reached the target number for each category or until the designated period for recruitment and data collection ends.

2. Procedures for the Collection of Information

Survey Component



a. Statistical Methodologies for Stratification and Sample Selection

With the possible exception of CDC and SAC program administrators (e.g., directors), the DoD does not maintain a centralized list with the names and contact information of the target population for the survey, namely staff in the military Child Development Program (CDP) workforce who are currently employed in a child development center (CDC) or a school-age care (SAC) program. This target population includes direct-care classroom staff (i.e., lead teachers and assistant teachers), center managers (e.g., directors and assistant directors), other professional staff (e.g., special education teachers, training and curriculum [T&C] specialists, behavioral specialists, nurses), and other support staff (e.g., administrative assistants, kitchen staff, and custodial and maintenance staff). In the absence of a survey frame, it will not be possible to implement a probability sampling methodology.

Instead, we will use a non-probabilistic methodology in which all individuals in the target population referenced above will be eligible to respond. Examples of other DoD surveys that use a non-probabilistic approach include the supplemental survey portion of the 2021 Active Duty Spouse Survey (ADSS) (Docket ID DOD-2021-OS-0021) (OMB Control Number 0704-0604); the EFMP Family Support Feedback Tool (Docket ID DoD-2023-OS-0033) (OMB Control Number 0704-FSFT); and the Interactive Customer Evaluation (ICE) family of data collections (Docket ID DOD-2020-OS-0098) (OMB Control Number 0704-0420).

b. Estimation Procedures

The data collected through the survey will be analyzed using simple descriptive statistics, such as means and medians and one- or two-way tabulations. Multivariate regression will also be used for descriptive purposes, to examine correlations across multiple variables. Standard errors will be presented to characterize the variation around the point estimates, but they will not be used for hypothesis testing. As sample sizes permit, we will provide summary descriptive information across the services to help convey the variation in our sample. However, we will not use statistical inference procedures (e.g., hypothesis testing) to analyze information collected through the survey, for the sample overall or for any subgroups.
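As a concrete illustration of this descriptive analysis plan, the sketch below computes a mean, a median, a standard error, and a one-way tabulation using Python's standard library; the variable names and values are hypothetical and not drawn from the survey:

```python
import math
import statistics
from collections import Counter

# Hypothetical item: years of child care experience reported by respondents.
years_experience = [1, 3, 5, 2, 8, 4, 6, 2, 10, 3]

mean = statistics.mean(years_experience)
median = statistics.median(years_experience)
# Standard error of the mean (sample SD / sqrt(n)): reported to convey
# variation around the point estimate, not used for hypothesis testing.
se = statistics.stdev(years_experience) / math.sqrt(len(years_experience))

# One-way tabulation of a hypothetical categorical item (staff position).
positions = ["direct care", "direct care", "management", "support", "direct care"]
tabulation = Counter(positions)
```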

For the several open-ended questions included in the survey, we will qualitatively code the responses and analyze the results using qualitative methods. The estimated volume of open-ended responses, several hundred cases (assuming 10 percent of survey respondents provide text in the open-ended questions), will be sufficient for theme saturation and convergence according to standard qualitative analysis protocols (Bernard, 2000; Strauss and Corbin, 1990). Coding will follow an iterative process to develop and refine themes that emerge but were not anticipated a priori.

c. Degree of Accuracy Needed for the Purpose Discussed in the Justification

Given the purpose of collecting simple descriptive information for a non-probabilistic sample, we do not seek to use the data for statistical inference or hypothesis testing.

d. Unusual Problems Requiring Specialized Sampling Procedures

None to discuss.

e. Use of Periodic or Cyclical Data Collections to Reduce Respondent Burden

We expect respondent burden to be low for this one-time data collection.

Case Study Component



a. Statistical Methodologies for Stratification and Sample Selection

As described above, the case studies will not use probability sampling or stratification. Findings from interviews will not be generalizable to the population of CDP personnel and community stakeholders in the same way that estimates derived from a probability sample would be generalizable. However, through the use of purposive sampling we will seek to achieve variation on important installation characteristics, as well as variation in the types of child care personnel and stakeholders we interview (e.g., direct care staff, support staff, and different types of staff within these broad categories). This variation will enable us to identify and describe factors that contribute to staffing challenges as they are experienced by diverse personnel and stakeholders in different contexts. These data will provide us with a more complete picture of staffing challenges and potential solutions within the CDP and help DoD improve its current strategies to recruit, train, and retain CDP staff.

b. Estimation Procedures

The semi-structured interviews will collect qualitative data in the form of typewritten field notes documenting participants’ responses to questions contained in an overarching interview protocol. Because the data will be qualitative rather than quantitative, we will not use the kind of estimation procedures associated with a traditional survey (e.g., computing estimated means or proportions for the population and associated confidence intervals). Instead, we will use qualitative methods to analyze the data and report findings. Field notes will be imported into commercially available qualitative analysis software (e.g., Dedoose or NVivo), and codes representing different topic areas will be applied to excerpts from the text. Excerpts will be exported by code and examined for themes, which will be summarized in a final report. We may describe themes that emerged for specific types of installations, CDP personnel, or community stakeholders. However, we will not use statistical estimation procedures to analyze information from the interviews, for the sample overall or for any subgroups.

c. Degree of Accuracy Needed for the Purpose Discussed in the Justification

Does not apply.

d. Unusual Problems Requiring Specialized Sampling Procedures

Does not apply.

e. Use of Periodic or Cyclical Data Collections to Reduce Respondent Burden

We expect respondent burden to be low for this one-time data collection.



3. Maximization of Response Rates, Non-response, and Reliability

Survey Component

We will employ various methods to improve the response rate of the survey. Given our use of a non-probabilistic sample, we will rely primarily upon outreach strategies to increase awareness of the survey for those eligible to respond and to increase participation among those who are informed about their eligibility to take the survey.

In particular, we will engage in an array of outreach strategies to make the targeted workforce members aware of the survey and to encourage their participation. The strategies, in collaboration with MC&FP, will include:

  • Email notices delivered to CDC and SAC program (site) administrators, based on current email contact information for CDP staff in these roles, containing a personalized link for them to take the survey and requesting that they disseminate information about the survey to their staff and invite their participation.

  • Flyers distributed to all relevant CDC and SAC programs and centralized CDP administrative offices, for posting on notice boards or emailing to local staff, that notify CDP workforce members of the existence and importance of the survey and invite them to participate.

  • Announcements in relevant newsletters that reach the DoD CDP workforce, describing the survey and inviting participation; these communication vehicles will be identified at the Service, region, installation, and command levels as appropriate.

  • Notices on the DoD Virtual Lab School (the DoD’s online training system), visible only to our targeted CDP staff, describing the existence and importance of the survey and inviting participation.

  • A survey website with relevant materials about the survey including:

    • A letter from the Department of Defense, which will highlight the importance of the survey to the Department and how it will be used to inform policymaking to benefit members of the CDP workforce.

    • A Frequently Asked Questions (FAQ) fact sheet indicating who is eligible to participate, outlining the content of the survey, and providing assurances about confidentiality and other aspects of informed consent.

    • Contact information for the study principal investigators and the survey helpdesk.

    • A web link to the online survey.

    • A date by which responses are requested.

These materials will be distributed when the online survey first becomes available and subsequently updated and distributed at 2 to 3 week intervals until the online survey is closed to further responses. For example, in subsequent updates, we will provide estimated response rates by service or installation to motivate increasing the participation rate among those who have not yet responded.

Despite efforts to minimize nonresponse, we still expect a high level of nonresponse, as is typical of non-probabilistic surveys. Given that we do not have a sample frame, it will not be possible to correct for nonresponse in the analysis of the data. When presenting findings from the survey, we will include clear statements, in nontechnical language, indicating that the survey may not be fully representative of the population of interest and that the findings are specific to the set of individuals who were made aware of the survey and chose to respond.

Case Study Component

We will use several methods to achieve high participation rates in interviews. As noted above, we will ask CDP leaders to communicate about the study with direct care staff and support staff in advance of each site visit. We will ask leaders to distribute, through the worksite’s regular communication channels, a one-page flyer with study information, our contact information, and an invitation to participate. For example, we may ask leaders to distribute paper copies of the flyer during in-person staff meetings and describe the study verbally using a set of talking points that they can tailor, or to disseminate electronic copies of the flyer through work email with a message that they can tailor. The flyer and any accompanying message will state that staff can participate in person during the upcoming site visit or schedule an online interview at a time that is convenient for them. Staff preferences will be collected through a web-based form hosted on Qualtrics. We will ask leaders to redistribute the flyer and remind staff about the site visit once per week until the site visit occurs.

For staff who prefer an in-person interview, we will request that CDP leaders allow staff to participate at a designated time during duty hours. For staff who prefer a virtual interview, we will use contact information collected through the web-based form to schedule online interviews via a secure videoconferencing platform (e.g., Microsoft Teams). For individuals who request to schedule a remote interview outside the timeframe of the site visit but during duty hours, we will request that the CDC or SAC provide a location and computer for online interviews. Interviews may also be conducted from a non-work location during non-work hours if the participant prefers.

We will encourage participation by clearly stating in all communications that interviews will be confidential. At the beginning of each interview, the interviewer will review key information about the study, ask the participant if they have any questions, and ask the participant if they agree to participate. The interview protocol will be designed so participants answer only relevant questions, which will help maximize response rates.



4. Tests of Procedures

Survey Component

A DoD-wide survey of the CDP workforce has not been conducted in recent memory, if ever. The survey instrument was therefore developed by drawing on questions from several recently fielded surveys of the early childhood and early elementary grades workforce. These surveys, all conducted outside the DoD where sample frames were generally available, include:

  • Texas Director Survey (Prenatal-to-3 Policy Impact Center, 2023)

  • Hawaiʻi Survey of Early Childhood Educators (Karoly et al., 2022)

  • State of the American Teacher and State of the American Principal Surveys (Doan et al., 2022)

  • Army Child and Youth Services Early Childhood Educators Survey (Nuttall, 2021)

  • Quality Start Los Angeles Early Childhood Workforce Survey (Gomez et al., 2021)

  • New Hampshire Preschool Development Grant Early Childhood Workforce Survey (Karoly et al., 2020)

  • National Survey of Early Care and Education (2019)

  • Organisation for Economic Co-operation and Development (OECD) Teaching and Learning International Survey (TALIS) Starting Strong (Sim et al., 2019).

With the exception of Nuttall (2021), these prior surveys focused on the civilian child care sector. For this reason, questions from these sources were modified as appropriate based on (a) results from these prior surveys, (b) the goals of the current study, and (c) differences in the military child care system relative to the civilian child care context. We also iterated with the DoD sponsor and incorporated their feedback in developing the survey instrument. We further tested the instrument by conducting cognitive interviews with up to nine respondents, and made further revisions, as needed, based on findings from those interviews.

Case Study Component

RAND uses best practices in the design and execution of interviews and in the subsequent analysis of qualitative data collected from interviews. To develop the interview protocol, the project team reviewed and leveraged questions from prior studies of military child care, tailoring the questions as needed to the goals of this study. The RAND team includes researchers and support staff with extensive experience developing interview protocols, conducting interviews, and analyzing interview data.



5. Statistical Consultation and Information Analysis

a. Names and telephone number of individual(s) consulted on statistical aspects of the design.

Claude Setodji, Ph.D., RAND, setodji@rand.org, (412) 683-2300 x4920

b. Name and organization of person(s) who will actually collect and analyze the collected information.

The RAND project team will collect and analyze the survey data under the supervision of the project’s co-principal investigators:

Laura Werber, Ph.D., RAND, lauraw@rand.org, (310) 393-0411 x6897

Lynn Karoly, Ph.D., RAND, karoly@rand.org, (310) 393-0411 x5359



References Cited

Bernard, H., Social Research Methods: Qualitative and Quantitative Approaches, Thousand Oaks, CA: Sage, 2000.

Doan, Sy, Lucas Greer, Heather L. Schwartz, Elizabeth D. Steiner, and Ashley Woo, State of the American Teacher and State of the American Principal Surveys: 2022 Technical Documentation and Survey Results, RAND, RR-A1108-3, 2022. As of May 7, 2023: https://www.rand.org/pubs/research_reports/RRA1108-3.html

Gomez, Celia J., Anamarie A. Whitaker, Jill S. Cannon, Susannah Faxon-Mills, and Mallika Bhandarkar, The Quality Start Los Angeles Developmental Evaluation: Research Findings and Lessons Learned, Santa Monica, CA: RAND, RR-A249-3, 2021.

Karoly, Lynn A., Jill S. Cannon, Celia J. Gomez, and Ashley Woo, Early Childhood Educators in Hawaiʻi: Addressing Compensation, Working Conditions, and Professional Advancement, RAND, RR-A1908-1, 2022.

National Survey of Early Care and Education, “NSECE 2019 Questionnaires,” webpage, undated. As of May 7, 2023: https://nsece.wordpress.com/nsece-2019-questionnaires/

Nuttall, Tamara, Army Child and Youth Services Early Childhood Educators: A Mixed-Methods Assessment of the Association Between Workplace Wellbeing and Turnover, dissertation, Concordia University, St. Paul, 2021. Retrieved from https://digitalcommons.csp.edu/edd/14

Prenatal-to-3 Policy Impact Center, Workgroup Recommendations to Inform the 2022 Child Care Workforce Strategic Plan, 2023. As of May 7, 2023: https://pn3policy.org/wp-content/uploads/2022/12/PN3PIC_Workgroup-Recommendationsto-Informthe-2022StrategicPlan.pdf

Schlieber, Marisa, Marcy Whitebook, Lea J. E. Austin, Aline Hankey, and Michael Duke, Teachers’ Voices: Work Environment Conditions That Impact Teacher Practice and Program Quality–Marin County, Berkeley, Calif.: Center for the Study of Child Care Employment, 2020.

Sim, Megan P. Y., Julie Bélanger, Agnes Stancel-Piątak, and Lynn A. Karoly, Starting Strong Teaching and Learning International Survey 2018: Conceptual Framework, OECD Education Working Paper No. 197, Paris, France: OECD, 2019.

Strauss, A. and J. Corbin, Basics of Qualitative Research: Grounded Theory Procedures and Techniques, Newbury Park, CA: Sage, 1990.



1 A more detailed list consists of the following staff positions: Administrative Assistants, Administrative Support Staff, Assistant Directors, Program Assistants, Program Leads, Cooks, Custodial Workers, Administrators, Assistant Directors, Coordinators, Facility Directors, Health Specialists, Instructional Programs Specialists, Program Associates, Program Specialists, Training Specialists, Program Liaisons, Laborers, Maintenance Workers, Vehicle Operators, Nurses, Nutritionists, Office Aides, Facility Specialists, Technicians, and Teachers.

2 In the Navy, these providers are referred to as Child Development Home (CDH) providers, but for parsimony we use the term FCC provider to encompass all providers who are certified by DoD to offer home-based care.

3 FCC providers are independent contractors and do not receive pay and benefits in the same way as CDP employees. However, we understand that the CDP needs to recruit FCC providers and sustain their participation in the program. Thus, we may examine factors such as the CDP’s payments to FCC providers and other supports as part of Policy Question 1.

4 In the Navy, these providers are referred to as Child Development Home (CDH) providers, but for parsimony we use the term FCC provider to encompass all providers who are certified by DoD to offer home-based care.


