CCWIS Self-Assessment Tool: CCWIS Design Requirements
OMB # 0970-0568
Expiration Date: 04/30/2024
The design requirements at 45 CFR 1355.53 read:
(a) Except as exempted in paragraph (b) of this section, automated functions contained in a CCWIS must:
(1) Follow a modular design that includes the separation of business rules from core programming;
(2) Be documented using plain language;
(3) Adhere to a state, tribal, or industry defined standard that promotes efficient, economical, and effective development of automated functions and produces reliable systems; and
(4) Be capable of being shared, leveraged, and reused as a separate component within and among states and tribes.
(b) CCWIS automated functions may be exempt from one or more of the requirements in paragraph (a) of this section if:
(1) The CCWIS project meets the requirements of section 1355.56(b) or (f)(1); or
(2) ACF approves, on a case-by-case basis, an alternative design proposed by a title IV-E agency that is determined by ACF to be more efficient, economical, and effective than what is found in paragraph (a) of this section.
The target outcome for a CCWIS implementation is to design a system that is modular, so it can meet the ever-evolving business requirements of a title IV-E agency (henceforth, “agency”), and that promotes the successful integration of the chosen implementation and infrastructure into a seamless, functional CCWIS. A high-quality modular system has an overall design strategy that leverages a modular architecture upon which modular software components, such as automated functions, are built.
ACF will review, assess, and inspect the planning, design, development, installation, operation, and maintenance of each CCWIS project on a continuing basis, in accordance with APD requirements in 45 CFR part 95, subpart F, to determine the extent to which the project meets the requirements in §§ 1355.52, 1355.53, 1355.56, and, if applicable, § 1355.54. The ongoing ACF reviews, assessments, technical assistance, and inspections that typically occur before the CAR are referred to as the “TA process,” “TA activities,” and/or “TA reviews.” ACF will not review every automated function developed within the CCWIS for design conformity. Rather, during development of the system, TA activities and reviews, and post-implementation CCWIS compliance reviews, ACF and the agency will agree on a select set of automated functions for review. ACF may ask agencies to perform a self-assessment using this tool before providing TA activities or engaging in other types of review.
This self-assessment tool is divided into sections as outlined on the chart below. Every question has a unique Element # for easy reference. Please refer to the instructions in Technical Bulletin #7 or contact your federal analyst if you have questions about the tool or a specific element.
Section | Element #
Overview and Background Information | I.A.xx
Self-Assessment – Part 1 – CCWIS Design Requirement Goals for Modular Design | I.B1.xx
Self-Assessment – Part 2 – CCWIS Design Requirement Goals for Plain Language | I.B2.xx
Self-Assessment – Part 3 – CCWIS Design Requirement Goals for Design and Development Standards | I.B3.xx
Self-Assessment – Part 4 – CCWIS Design Requirement Goals for Sharing, Leveraging, and Reusing CCWIS Automated Functions | I.B4.xx
Resources and Additional Considerations | I.C.xx
Guidance on CCWIS Design Requirement Goals for Modular Design – 1355.53(a)(1) | I.C.1
Guidance on CCWIS Design Requirement Goals for Plain Language – 1355.53(a)(2) | I.C.2
Guidance on CCWIS Design Requirement Goals for Design and Development Standards – 1355.53(a)(3) | I.C.3
Guidance on CCWIS Design Requirement Goals for Sharing, Leveraging, and Reusing CCWIS Automated Functions – 1355.53(a)(4) | I.C.4
Overall CCWIS Conformance with Design Requirements Considerations | I.C.5
Exemption from Conformance with CCWIS Design Requirements (1355.53(b)) | I.C.6
Sample Scenarios of Different Implementations that ACF would consider Eligible for Design Review | I.C.7
Evaluating Conformance with CCWIS Design Requirements: Pilot Methodology and Sample Scoring Sheet | I.C.8
Resources | I.C.9
The Overview and Background Information section collects overall technology information on the automated function and the system as a whole. A title IV-E agency may use this self-assessment tool to collect information that ACF may request from agencies during an assessment of one or more automated functions’ conformance to CCWIS design requirements. Agencies may cross-reference information if it is already contained in an APD or project artifact. Answers should be clear and concise. If a question is not applicable, enter “N/A” and provide a reason.
I.A.01 Provide a brief overview of the purpose and behavior of the automated function.
|
I.A.02 Describe the high-level system architecture the agency used and where the automated function fits into it. Include an overview of associated infrastructure, platforms, services, software components, exchanges, and other tools and technologies. Specific implementation plans are documented in the APD and the agency may reference the applicable APD(s) or document information here.
|
I.A.03 Describe the technical design of the automated function itself, and its system requirements. The agency may reference system documentation or project artifacts.
|
I.A.04 Describe how other systems or components can use the CCWIS automated function. What interfaces/APIs or other mechanisms are available for exchanging data and leveraging supported functionality? ACF makes no assumptions or recommendations about application architecture: these interfaces could live in a library loaded into a single process or, more commonly today, in a web API connecting multiple processes. ACF expects agencies to build interfaces that are readily understandable, so potential adopters can evaluate whether a module might be a good fit for their system and understand how to call the module’s functionality when integrating it into their system.
|
I.A.05 What is the current implementation status of your automated function? Is the automated function and related documentation ready to be shared through the federal software repository to facilitate reuse (as permissible)?
|
I.A.06 Are there any additional comments you would like to provide as background to the design of this automated function?
|
In this section, the agency may document components, factors, and design elements of the function(s) or exchanges that support the design goals of the CCWIS. If the agency has additional goals, please include them below and add new rows as needed. We encourage agencies to simplify their responses by referencing submitted documentation, such as APDs, or by attaching artifacts such as general and detailed design documents, logical data models, and test plans.
Please answer each question fully. If a goal is not applicable to the CCWIS, indicate “N/A” and explain why it is not applicable.
Part 1 – CCWIS Design Requirement Goals for Modular Design at 1355.53(a)(1)
Assess whether the automated function’s modular design separates business rules from core programming. Core programming may be code other than business logic, such as code for interfaces and data access layers. Refer to section C for additional guidance on modular design goals.
# |
Modular Design Goal |
Evidence the System Supports the Goal |
I.B1.01 |
Architecture Pattern: The CCWIS or automated function institutes an architectural pattern that incorporates an 'n-tier' layered design or other structured topology specifying architecture components with clear roles, responsibilities, and relationships.
Typically demonstrated in architecture and design documentation of the overall CCWIS. |
|
I.B1.02 |
Business Rules: The CCWIS business rules are separated from the core programming.
Typically demonstrated through design documentation that reflects how business rules are segregated into a separate layer or component. Business rules may also be independently managed in association with a distinct business rules engine. |
|
I.B1.03 |
Rules Engine: The agency uses a business rules engine to define the business rules for the CCWIS automated functions.
Typically demonstrated in design documentation. |
|
I.B1.04 |
Testing: A set of unit tests is present to verify implementation of business rules.
Typically demonstrated by sharing a collection of test cases pertaining to business rule functionality. |
|
I.B1.05 |
Coupling: The automated function has been designed with clear boundaries.
Typically demonstrated through a design document or interface control document (ICD) that details and describes the set of interfaces for the automated function. |
|
I.B1.06 |
Coupling: The automated function does not require other automated functions to perform its tasks.
Typically demonstrated through a design document that details and describes the automated function dependencies. |
|
I.B1.07 |
Coupling: The automated function efficiently communicates with other automated functions within the CCWIS.
Typically demonstrated through design document process views and communication architecture descriptions that denote how communications components are linked and describes communications flow between CCWIS components. |
|
I.B1.08 |
Coupling: The identified automated function is easily severable from the CCWIS.
Typically demonstrated through reference to automated function installation, setup, configuration, and usage information independent of overall CCWIS documentation. |
|
I.B1.09 |
Cohesion: The identified automated function reflects a discrete, easily defined purpose that does not significantly overlap with any other automated function within the CCWIS.
Typically demonstrated by a simple purpose description in automated function requirements and design documentation. |
|
I.B1.10 |
Cohesion: The automated function’s functionality is designed to meet the needs of a business function performed by the agency.
Typically demonstrated by documenting stakeholder input to automated function design, such as through a user-centered design process, and through user acceptance testing results. |
|
I.B1.11 |
Cohesion: Agency staff (and their business partners) who perform the business function supported by the automated function were given an opportunity to participate in designing the automated function.
Typically demonstrated by documenting stakeholder input to automated function design, such as through a user-centered design process. |
|
I.B1.12 |
Computer Generated: The agency uses automated tools to generate code in the CCWIS.
|
|
Part 2 – CCWIS Design Requirement Goals for Plain Language at 1355.53(a)(2)
Assess whether documentation is easy to read and understand. Refer to section C for additional guidance on plain language goals.
# |
Plain Language Goal |
Evidence the System Supports the Goal |
I.B2.01 |
Know Your Audience: Agency staff write the document with familiarity with the audience, define why the audience needs the document, and write so that all levels of staff can understand it.
Typically demonstrated by clear identification of target audiences across automated function documentation, and by minutes and other records of documentation reviews that include references to inspection of fit to audience. |
|
I.B2.02 |
Organize Your Thoughts: The document is organized to provide clear and concise points.
Typically demonstrated by records of documentation reviews that include references to inspection of clarity and conciseness. |
|
I.B2.03 |
Summarize Main Points: Documentation uses formatting, headings, lists, tables and other visual cues to create a structure that enables easier location of information and better engagement of readers.
Typically demonstrated by records of documentation reviews that include references to inspection of documentation for use of headers and lists. |
|
I.B2.04 |
Write Short Sentences and Paragraphs: Documentation is composed of concise sentences. Documentation provides an initial context for the ideas that will be discussed and incorporates definitions into the text. The paragraphs are simple, with one topic sentence and one idea developed throughout the paragraph.
Typically demonstrated by records of documentation reviews that include references to inspection of sentence and paragraph simplicity and conciseness. |
|
I.B2.05 |
Use Everyday Phrases and Words: Documentation speaks to the audience (at all levels of expertise) and does not use extraneous words.
Typically demonstrated by records of documentation reviews that include references to inspection of sentence and paragraph simplicity and conciseness. |
|
I.B2.06 |
Limit Use of or Do Not Include Technical Jargon: Documentation limits or excludes technical jargon, does not use abbreviations, and explains acronyms.
Typically demonstrated by records of documentation reviews that include references to inspection of technical jargon. |
|
I.B2.07 |
Use Strong Subjects and Verbs: Documentation is composed with strong subjects and verbs; it uses active voice where possible and keeps sentence structure simple.
Typically demonstrated by records of documentation reviews that include references to inspection of using strong subjects and verbs, and of active voice. |
|
I.B2.08 |
Define Uncommon Terms: Documentation defines uncommon terms in the body of the text as well as within a glossary.
Typically demonstrated by records of documentation reviews that include references to inspection for undefined uncommon terms. |
|
I.B2.09 |
Proof-Read and Edit: Documentation is free of grammatical errors.
Typically demonstrated by records of documentation reviews that include references to inspection for grammatical errors. |
|
Part 3 – CCWIS Design Requirement Goals for Design and Development Standards at 1355.53(a)(3)
Assess whether design and development standards were used and adhered to during development of the automated function. Refer to section C for additional guidance on design and development standard goals.
# |
Design and Development Standards Goal |
Evidence the System Supports Goal |
I.B3.01 |
Adherence to Standards: The agency developed and conducted a process for evaluating adherence to design and development standards.
Typically demonstrated by identifying leveraged design and development standards and agency’s process of review. |
|
I.B3.02 |
Adherence to Standards: The agency acquired or leveraged autonomous quality management (QM) or independent verification and validation (IV&V) services to monitor the project during development.
Typically demonstrated through records of services acquisition. |
|
I.B3.03 |
Adherence to Standards: The agency adheres to its design and development standards for the period under review.
Typically demonstrated through records of review sessions and associated changes made to adhere to standards. |
|
I.B3.04 |
Adherence to Standards: The agency trains staff on standards used and where they can be found.
Typically demonstrated by identifying leveraged development standards and agency’s process of review. |
|
I.B3.05 |
Adherence to Standards: The agency performs code reviews to determine the quality of the code produced.
Typically demonstrated through code review minutes or logs. |
|
I.B3.06 |
Adherence to Standards: The agency confirms adherence to design and development standards during internal project and code reviews.
Typically demonstrated through internal project and code review minutes, logs, or associated version control repositories showing changes made to adhere to standards. |
|
I.B3.07 |
Written Documentation of Standards: The agency maintains written documentation of the software design and development standards used for automated functions designed for the CCWIS.
Typically demonstrated through reference to standards documentation. |
|
I.B3.08 |
Written Documentation of Standards: Data sharing agreements are based on agency data exchange standards.
Typically demonstrated by referencing and detailing alignment of data sharing agreements and agency data exchange standards. |
|
I.B3.09 |
Written Documentation of Standards: Standards used for automated functions are based on state, tribal, and/or industry-defined standards.
Typically demonstrated by referencing leveraged standards pertaining to developing reusable CCWIS automated functions, and association with state, tribal, and/or industry standards. |
|
I.B3.10 |
Written Documentation of Standards: The agency maintains written documentation of the standards on commercial-off-the-shelf (COTS), or software-as-a-service (SaaS) automated functions, if applicable.
Typically demonstrated by referencing leveraged standards pertaining to using COTS and SaaS for CCWIS automated functions. |
|
I.B3.11 |
Efficient, Economical, Effective (“Three E’s”): The automated function performs as designed.
Typically demonstrated with automated function user-acceptance testing (UAT) reports. |
|
Part 4 – CCWIS Design Requirement Goals for Sharing, Leveraging, and Reusing CCWIS Automated Functions at 1355.53(a)(4)
Assess whether the automated function can be effectively shared, leveraged, and reused. Refer to section C for additional guidance on sharing, leveraging, and reusing goals.
# |
Share, Leverage, and Reuse Goal |
Evidence the System Supports the Goal |
I.B4.01 |
Share - Included Metadata: Automated function is easily identifiable via a unique name that does not conflict with an existing project and does not infringe on trademarks.
Typically demonstrated through clear specification and use of unique automated function name. |
|
I.B4.02 |
Share - Included Metadata: The source, contributor, and points-of-contact for the identified automated function are clearly specified.
Typically demonstrated through clear documentation of specified information in association with the automated function. |
|
I.B4.03 |
Share - Included Metadata: Product status, version information and release notes for the automated function are provided.
Typically demonstrated through reference to automated function product and release information. |
|
I.B4.04 |
Share - Included Metadata: Automated function licensing information is provided.
Typically demonstrated through reference to automated function product licensing information; normally included as a text file along with the code. |
|
I.B4.05 |
Share - Included Metadata: A product README file and links to more comprehensive documentation for the automated function are provided. (A README file is usually a simple plain text file that contains information about other files in a directory or archive of computer software.)
Typically demonstrated through reference to automated function README. |
|
I.B4.06 |
Share - Policy and Procedures Management: Automated function is accompanied by information describing the process and plans for maintaining, updating, and ending support for code.
Typically demonstrated through reference to automated function product roadmap. |
|
I.B4.07 |
Share: An issue queue is available to view and track progress on known bugs, enhancement requests, and other issues.
(This indicator is considered N/A until procedures are established for C-SWAP.) |
|
I.B4.08 |
Share: Communication channels and feedback mechanisms are available to allow automated function recipients to query maintainers and get answers to questions.
(This indicator is considered N/A until procedures are established for C-SWAP.) |
|
I.B4.09 |
Share - Unique Purpose: The identified automated function comprises features that may be enabled, disabled, configured, or removed.
Typically demonstrated through reference to automated function setup, configuration, and customization documentation. |
|
I.B4.10 |
Share - Demonstrated Test Coverage: Identified automated function is accompanied by evidence, such as test plans and results, of comprehensive testing.
Typically demonstrated in association with test plans and test reports that indicate the proportion of code executed. |
|
I.B4.11 |
Leverage - Clear Requirements Documentation: Automated function is accompanied by comprehensive documentation on features and functionality.
Typically demonstrated through reference to a systems/software requirements specification (SRS) or similar documentation. |
|
I.B4.12 |
Leverage - Security and Compliance: Automated function is accompanied by reports describing the results of performed vulnerability testing.
Typically demonstrated through reference to vulnerability test reports. |
|
I.B4.13 |
Leverage - Security and Compliance: Automated function is assessed against relevant security and privacy controls such as the National Institute of Standards and Technology Special Publication 800-53 (NIST SP 800‑53).
Typically demonstrated through reference to a System Security Plan (SSP) and associated control documentation. |
|
I.B4.14 |
Leverage - System requirements, installation, integration, configuration, and administration procedures: Automated function is accompanied by a software installation plan (SIP) or other documentation detailing system requirements and installation procedures.
Typically demonstrated through reference to a SIP or similar documentation. |
|
I.B4.15 |
Leverage - System requirements, installation, integration, configuration, and administration procedures: Automated function is accompanied by documentation detailing required and recommended configuration information.
Typically demonstrated through automated function setup and configuration documentation such as a system security plan. |
|
I.B4.16 |
Leverage - System requirements, installation, integration, configuration, and administration procedures: Available documentation details external interfaces and integration points to allow system integrators to incorporate and leverage the automated function.
Typically demonstrated through automated function integration requirements documentation. |
|
I.B4.17 |
Leverage - System requirements, installation, integration, configuration, and administration procedures: Automated function is accompanied by an administration manual or procedures to facilitate effective system administration.
Typically demonstrated by reference to an administrator’s guide, operations and maintenance (O&M) manual, or similar documentation. |
|
I.B4.18 |
Reuse – Framework: Automated function is architected to leverage established software frameworks and established, industry-standard underlying design patterns.
Typically demonstrated in architecture and design documentation. |
|
I.C.1 Guidance on CCWIS Design Requirement Goals for Modular Design – 1355.53(a)(1)
The CCWIS modular design requirement calls for separating business rules from core programming (see 1355.53(a)(1)). In the requirement, core programming may be code other than business logic, such as code for interfaces and data access layers. Software systems broken up into modules follow a principle of separation of “concerns,” or tasks. The separation of concerns creates layers within programming, with each layer specializing in the “type” of task it performs. In addition, the business rules should be encapsulated, reusable, allow for substitution, and provide a well-defined interface that can be used by internal and external systems.
Modular Design with Separation of Business Rules from Core Programming
Multi-layer Modular Architecture: A multi-tiered (n-tier), multi-layered application architecture is widely used by industry and refers to the separation of the application into physical tiers and logical layers with specific roles and responsibilities. Most applications with a modular design will also have a multi-layered design. An application typically comprises presentation (user interface), service, business logic (rules), and data access (persistence) layers. ACF will determine whether an automated function follows the principle of the separation of business rules from core programming; one way to assess this separation is to determine whether the code follows a layered architecture. ACF will determine whether the application is separated into at least three layers. An agency may develop more layers in their code.
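To make the layering principle concrete, the following is a minimal sketch, assuming a Python implementation; the names (Case, CaseRepository, EligibilityRules, CaseService) and the eligibility rule itself are hypothetical illustrations, not actual CCWIS code or title IV-E policy.

from dataclasses import dataclass


@dataclass
class Case:
    case_id: str
    child_age: int
    in_state_custody: bool


class CaseRepository:
    """Data access (persistence) layer: storage only, no business logic."""
    def __init__(self):
        self._store = {}

    def save(self, case: Case) -> None:
        self._store[case.case_id] = case

    def find(self, case_id: str) -> Case:
        return self._store[case_id]


class EligibilityRules:
    """Business rules layer: policy decisions only, no persistence or UI."""
    @staticmethod
    def is_candidate(case: Case) -> bool:
        # Hypothetical rule for illustration; real eligibility criteria differ.
        return case.in_state_custody and case.child_age < 18


class CaseService:
    """Service layer: coordinates rules and data access for the presentation layer."""
    def __init__(self, repo: CaseRepository, rules: EligibilityRules):
        self._repo = repo
        self._rules = rules

    def check_eligibility(self, case_id: str) -> bool:
        return self._rules.is_candidate(self._repo.find(case_id))

Because the policy decision lives only in the business rules layer, a rule change does not touch persistence or presentation code, and the rules layer can be exercised by unit tests in isolation.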
Coupling: Coupling within any information system happens on a scale from complete independence to monolithic systems.
At a high level, coupling addresses whether modules can function independent of other modules. If they can, they are considered weakly or loosely coupled; if they cannot, they are considered strongly or tightly coupled. Systems built of tightly coupled modules do not have clear boundaries between the functionality of one module and that of the next. For example, a CCWIS implementation that includes an intake module loosely coupled with an investigations module allows the agency to substitute or maintain the intake module without disrupting the functioning of the investigations module.
Effective coupling within a CCWIS means that system-wide functions and data/input dependencies should be specified and kept to a minimum.
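The intake/investigations substitution described above can be sketched in code. In the hypothetical Python sketch below, the investigations module depends only on a narrow interface, so the intake implementation behind it can be swapped without changes to investigations code; the IntakeReader protocol and both classes are assumptions for illustration, not actual CCWIS components.

from typing import Protocol


class IntakeReader(Protocol):
    """The only contract the investigations module sees."""
    def get_report(self, report_id: str) -> dict: ...


class InvestigationsModule:
    def __init__(self, intake: IntakeReader):
        self._intake = intake  # depends on the interface, not a concrete intake module

    def open_investigation(self, report_id: str) -> dict:
        report = self._intake.get_report(report_id)
        return {"report_id": report_id, "subject": report.get("subject"), "status": "open"}


class LegacyIntake:
    """One interchangeable implementation; a replacement need only match the protocol."""
    def get_report(self, report_id: str) -> dict:
        return {"subject": "example subject", "source": "legacy"}


# Substituting a new intake module requires no change to InvestigationsModule:
investigations = InvestigationsModule(LegacyIntake())
print(investigations.open_investigation("R-100"))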
Cohesion: A modular design approach applies to both the system and the internal design of each module. CCWIS modules are expected to perform a single action or set of actions to meet an objective. Cohesion describes the extent to which like functions are grouped together in CCWIS modules.
Agencies define what automated functions are within their CCWIS systems and what functionality is included within those automated functions. Generally, agencies are encouraged to identify the business functions of the agency and build automated functions around supporting those business needs.1
Business Rules Engine: Agencies may use a business rules engine to help define the business rules for the CCWIS. This engine may be a state/tribal-developed engine, one provided by a vendor, or purchased on the open market. Business rules engines may help projects adhere to the development standards selected for the CCWIS project and help keep the business rules in their separate layer of the coding.
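As a minimal sketch of this idea, assuming a Python implementation, the fragment below keeps rules as data evaluated by a small generic engine; a production CCWIS would more likely use a dedicated rules engine, and the rule definitions and field names here are hypothetical.

import operator

OPERATORS = {"<": operator.lt, ">=": operator.ge, "==": operator.eq}

# Rules live outside core programming, e.g., loaded from configuration;
# these definitions and field names are illustrative only.
RULES = [
    {"name": "age_check", "field": "child_age", "op": "<", "value": 18},
    {"name": "custody_check", "field": "in_state_custody", "op": "==", "value": True},
]


def evaluate(record: dict, rules: list) -> dict:
    """Return the pass/fail outcome of each named rule for one record."""
    return {r["name"]: OPERATORS[r["op"]](record[r["field"]], r["value"]) for r in rules}


print(evaluate({"child_age": 16, "in_state_custody": True}, RULES))
# prints {'age_check': True, 'custody_check': True}

Because the rules are data, policy staff can review and change them without modifying the engine, which supports keeping business rules in their own layer.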
Computer-Generated Code and Configuration: Modern computer programming can be accomplished using technology to automate coding. If the agency uses techniques that automatically generate code, then the agency must ensure that the generated code meets the design standards set by the agency and that the code fits within the n-tier architecture of the CCWIS. ACF may review any code that makes up an automated function, regardless of whether it was human- or computer-generated.
I.C.2 Guidance on CCWIS Design Requirement Goals for Plain Language – 1355.53(a)(2)
Documentation must be easy to read and understand. Documentation may include, but is not limited to, system documentation, operations documentation, installation documentation, integration documentation, configuration documentation, software design documentation, test suites proving function correctness, user-stories, use cases, product backlog, product increment, programming documentation, and user documentation (including screen help, training materials, and user manuals).
Examples of documentation that should adhere to the plain language requirement include:
automated functions documented within the automated functions checklist that support CCWIS;
any programs and report documentation;
any programming code and supporting documentation for bi-directional data exchanges;
data exchange standards documented both in program code and system documentation for bi-directional data exchanges and for data exchanges with systems; and
data quality plans and data quality biennial review documents.
ACF will assess the documentation to look for adherence to plain language usage, logical flow and progression, and usage of industry standard terminology.
The plain language standards can be categorized into five main areas: defining the audience, organizing the document, designing and formatting the document, writing the document, and editing the document.
I.C.3 Guidance on CCWIS Design Requirement Goals for Design and Development Standards – 1355.53(a)(3)
Agencies often have their own standards or inherit standards from overarching information technology divisions within their governments. It is common for Chief Information Officers and Chief Technology Officers to set standards for agencies to follow when producing, maintaining, and operating information systems. Agencies are often required to affirm that they will follow such standards during procurement processes to acquire the goods or services to produce, maintain, or operate these systems.
A title IV-E agency can submit the design and development standards anytime during the project life cycle, but the agency must submit the standards before the CAR. During TA activities, ACF may ask the agency to walk through agency design standards and respond to questions. The agency may be asked to demonstrate that the standards were used and adhered to during development of the CCWIS, and demonstrate the efficiency, economy, and effectiveness of the design and development standards and that they produce reliable systems.
Demonstrate that the Standards were Used and Adhered to During Development of the CCWIS
During TA activities and reviews, ACF may request:
reports from Independent Verification and Validation (IV&V) vendor(s) and Quality Management (QM) vendors who monitored the project during development;
training materials that inform staff of the standards that the agency uses or provides a location where the standards may be reviewed;
agency analysis and findings from their technical review of a vendor’s products;
documentation of compliance with the standards during a documentation and code review; and
a demonstration of the CCWIS functionality.
Submission of Written Documentation of Agency Standards
Agency-defined standards include design standards developed by the agency or adopted from other internal state or tribal bodies of standards. As an alternative to agency-defined standards, agencies that have adopted industry standards may submit their documentation by providing web links to the standards body’s web site, if they do not keep an internal copy of the standards. If the agency has customized industry standards, then those customizations should be documented by the agency.
The CCWIS may use modules from multiple sources that may have been produced with different development standards than the rest of the CCWIS. For instance, the CCWIS may use a module from the C-SWAP federal software repository designed and built for another agency. For each source of technology within the CCWIS (agency built, acquired via purchase, acquired via another agency, etc.), the agency should know the design standard used and have documentation of that standard. The agency should be prepared to produce and submit documentation of standards used for any automated function in their CCWIS, regardless of the source of the technology. ACF will not review internal adherence to standards within proprietary products that may be exempted from CCWIS design requirements.
Demonstrate the Efficiency, Economy and Effectiveness of the Development Standard and that it produces Reliable Systems
The agency should consider periodically reviewing their standards to ensure they are still relevant and up to date.
A standard that supports efficient, economical, and effective development of the CCWIS does not hinder the project team during implementation and enhancements. During TA activities and/or a TA review, ACF will focus on adherence to the design and development standards to assess their efficiency and effectiveness in development of reliable automated functions. The agency must demonstrate that the finished automated function performs as designed, with minimal issues.2
I.C.4 Guidance on CCWIS Design Requirement Goals for Sharing, Leveraging, and Reusing CCWIS Automated Functions – 1355.53(a)(4)
This section of the self-assessment presents mechanisms by which ACF may determine whether an identified automated function complies with the requirement that the function be capable of being effectively shared, leveraged, and reused by other states and tribes.
Share
These attributes of a CCWIS automated function will facilitate effective sharing:
Included metadata: The shared CCWIS automated function should be associated with a set of metadata and high-level documentation that uniquely identifies the software and provides critical information. Key metadata should include the creator/contributor, points of contact, product name, product version, release notes, product license (public domain, open source license, proprietary), and product development status (for example, under active development). A README document should also be included, along with links to additional documentation.
Policy and procedures management: Shared CCWIS automated functions should come with policy and information that describe plans and processes for the management, on-going development, maintenance, updating, and disposition of code. Plans may incorporate elements such as feature roadmaps and software end-of-life (EOL). An issue queue should be established, and knowledge transfer mechanisms should be specified for communicating and addressing issues. Likewise, avenues for responding to community queries and other interactions (e.g., recommended code changes to address bugs) could be established.
Unique purpose: Modules should contain related resources that enable them to accomplish a task. General software best practice entails having many small and focused modules to promote code reuse and turn those modules into effective building blocks. Extending this to CCWIS automated functions means that each automated function should reflect a singular overall purpose. Likewise, to the degree that a module contains multiple (albeit related) features, mechanisms should be provided to enable, disable, configure, or even remove those features.
Severable: The automated functions within a CCWIS are more easily shared by states and tribes if those systems are built with severable components. A severable component is readily removed from its usage context. Usually, this means that an automated function should leverage standardized messages and common interfaces so external components’ connectors need know nothing of the internal functions or data encapsulated within the module.
Demonstrated test coverage: To instill confidence in the quality of a shared CCWIS automated function, evidence of adequate test coverage should be presented. While measures of test coverage vary with the coverage item used (such as lines of code, subroutines, paths, or number of scenarios under test for a business process), evidence of automated functional testing that traces tests back to developed functional requirements is optimal.
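The following is a minimal sketch, using Python’s standard unittest module, of the kind of business rule tests an agency might present as evidence of coverage; the rule and its thresholds are hypothetical, not actual eligibility policy.

import unittest


def is_candidate(child_age: int, in_state_custody: bool) -> bool:
    """Hypothetical business rule under test; not actual eligibility policy."""
    return in_state_custody and child_age < 18


class TestEligibilityRule(unittest.TestCase):
    def test_eligible_when_in_custody_and_under_18(self):
        self.assertTrue(is_candidate(16, True))

    def test_ineligible_when_not_in_custody(self):
        self.assertFalse(is_candidate(16, False))

    def test_boundary_age_18_is_ineligible(self):
        self.assertFalse(is_candidate(18, True))


if __name__ == "__main__":
    unittest.main()

Tests like these, named after the requirement they verify, help trace coverage back to functional requirements.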
Leverage
These aspects of a shared CCWIS automated function will facilitate its effective use:
Clear requirements documentation: When considering the use of a CCWIS automated function, evaluators should have a fundamental and comprehensive understanding of its purpose, features and functionality. An effective representation of the functional and non-functional requirements (or equivalent variations such as agile user stories) for an automated function is often a reflection of the quality of the product and its underlying code, given that clear and analyzed requirements are a critical element for achieving high-quality software development.
Security and compliance: To leverage a CCWIS automated function, the agency should assess the extent to which that automated function complies with security and privacy requirements. Security requirements are specified in a wide range of federal laws, executive orders, National Institute of Standards and Technology (NIST) special publications (SP), Federal Information Processing Standards (FIPS), and Office of Management and Budget (OMB) and Government Accountability Office (GAO) circulars and guidelines. Leveraging government and industry tools and standards – NIST, Center for Internet Security (CIS), Open Web Application Security Project (OWASP), and Security Content Automation Protocol (SCAP) – automated functions should be evaluated for security vulnerabilities and, in association with intended usage and potential integrations, reviewed against security controls.
System requirements, installation, integration, configuration, and administration procedures: System integrators and administrators should have installation, configuration and other support documentation to effectively leverage shared CCWIS automated functions. Documentation including step-by-step procedures may be augmented by automated configuration routines.
Reuse
These aspects of a shared CCWIS automated function will facilitate its effective and continual reuse:
Frameworks: Use of well-established, standards-based software frameworks eases the on-going maintenance and integration of an automated function. Use of flexible and extensible frameworks can also reduce efforts required to customize the automated function based on specific needs.
Design patterns: By leveraging standard design patterns to address recognized needs, CCWIS automated functions leverage best-practice structured approaches that are more readily understood and reused.
I.C.5 Overall CCWIS Conformance with Design Requirements Considerations
These considerations may aid agencies in implementing CCWIS systems that promote development of automated functions that adhere to CCWIS design requirements and meet modular design goals.
The design approach adheres to established data standards (including data exchange standards) to facilitate the integration of all modules in the CCWIS, ensuring reliable, quality data and an effective user experience.
CCWIS architecture governs and integrates each module from a functional, user experience, interface and data management, and shared service perspective.
Every automated function integrates into the overall CCWIS. (While each module is separate, stand-alone and functionally capable, every automated function must be part of a larger eco-system that results in the CCWIS solution.)
The CCWIS design approach allows the agency to adjust promptly to changing business needs or to replace a module that does not perform.
The design approach allows the agency to efficiently and economically improve and extend the CCWIS solution.
The CCWIS has a consistent user experience throughout all of its modules.
Project documentation is written using plain language.
The project regularly meets release deadlines.
The project remains within scope and budget, as defined in the APDs.
Systems can monitor and measure system availability, e.g., establishing measures to meet four nines of availability (99.99%) and beyond (see the downtime-budget sketch after this list).
The CCWIS has a fail-safe for when it stops functioning. This may be backup servers or other technology that allows the production environment to seamlessly operate if failure occurs.
The CCWIS has disaster recovery procedures in case catastrophic failure occurs.
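The following downtime-budget sketch, in Python, shows how an availability target such as the four nines mentioned above translates into allowable annual downtime; the arithmetic is illustrative, not an ACF requirement.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

# Each target below is illustrative; "four nines" is 99.99% availability.
for label, target in [("three nines", 0.999), ("four nines", 0.9999), ("five nines", 0.99999)]:
    downtime_minutes = MINUTES_PER_YEAR * (1 - target)
    print(f"{label}: about {downtime_minutes:.1f} minutes of downtime per year")

# four nines allows roughly 52.6 minutes of downtime per year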
I.C.6 Exemption from Conformance with CCWIS Design Requirements (1355.53(b))
ACF will take the exemption status of functionality within the CCWIS into account when determining what is eligible for review against the CCWIS design requirements. ACF will review CCWIS automated functions for compliance with the CCWIS design requirements and for overall CCWIS design. ACF will not review any CCWIS automated function for which an agency has requested, and ACF has granted, an exemption from the CCWIS design requirements at 1355.53(b). Exemptions apply only to the automated function or functions specified in the exemption request. Any automated function that is neither exempted under the CCWIS design requirements exemption nor automatically exempt will be considered for review.
Agencies choose which functionality to designate as automated functions within their child welfare information systems. This applies to functionality built in house, implemented by a vendor, reused or repurposed from another agency, or purchased from the open market. All such automated functions, excluding purchased proprietary products owned or maintained by a vendor, such as Commercial-Off-The-Shelf (COTS) products, which are automatically exempt, may qualify for the CCWIS design requirements exemption under 45 CFR 1355.53(b)(2).3 If the agency elects to request exemption for an automated function, the agency must make a business case to ACF in writing that the functionality is more efficient, economical, and effective than is required by the CCWIS design requirements. An exemption will not be granted for legacy software not meeting the CCWIS design requirements.
The agency should clarify in their business case for the exemption a reasonable boundary of the automated function within the CCWIS for which the exemption is requested. For example, consider the scenario of a COTS placement matching tool configured to fit a placement services automated function. ACF would consider a reasonable exemption boundary to be the placement services automated function where the placement matching tool resides.
The exemption of CCWIS design requirements for a COTS product has limits. For instance, ACF will not consider an exemption request covering the majority of a CCWIS’s automated functions simply because minor COTS products are used throughout.
ACF may review any automated function for compliance with the CCWIS design requirements. For instance, should the implementation of the automated function not match the description provided by the agency in their exemption request, ACF may review the automated function for compliance with design requirements. In another example, should an agency significantly change an automated function exempted from the CCWIS design requirements, ACF may require that the agency resubmit an exemption request, or ACF may review the changed automated function for compliance.
I.C.7 Sample Scenarios of Different Implementations that ACF would consider Eligible for Design Review
The following are examples of different CCWIS implementations and determinations of whether ACF will review the CCWIS design. This is not an exhaustive list.
The following implementation scenarios apply to a new CCWIS, or where the agency is adding new functionality to an existing child welfare information system that must meet the CCWIS design requirements.
Custom code from the ground up: An agency implements a CCWIS via an information technology vendor or multiple vendors to build the CCWIS without a proprietary software platform as its base. The functionality built that constitutes the CCWIS automated functions is owned by the agency and funded with federal financial participation (FFP). ACF will review the CCWIS automated functions for compliance with the CCWIS design requirements.
Use of a proprietary child welfare information system with customization: An agency implements a CCWIS via an information technology vendor or multiple vendors to build the CCWIS from the starting point of a proprietary software application. Such applications may be available as an on-premises solution, a hosted solution, or a cloud solution, which in turn may be available as a product or as a service. Unless otherwise exempted, ACF will review the automated functions for their compliance with CCWIS design requirements.
Transfer system from another agency: An agency implements a CCWIS by transferring an existing child welfare information system from another agency for its own use. Regardless of the CCWIS compliance status of an application in a different agency, if the application is transferred to another agency, then ACF will review it in the new setting.
Adding code on top of a platform: An agency implements a CCWIS via acquiring a platform technology and then building or acquiring software to run on top of that platform technology. Platform technology typically provides a foundation for an agency’s business-related functionality. The code built on top of the platform is the code that ACF would review for compliance with the design requirements. ACF does not certify platforms, systems or modules built by vendors as CCWIS compliant applications. An ACF determination of software conformance to design requirements in one CCWIS implementation does not mean that the software will be CCWIS compliant in any other future implementation.
Configuration with a partially or pre-configured PaaS or SaaS environment (e.g., with vendor “accelerators”): An agency implements a CCWIS via configuration of a platform (and possible building of custom code to support the configuration) while using an “accelerator” as the starting point of configuration. ACF will review the configuration and custom code for their compliance with CCWIS design requirements. The agency may request exemption of the CCWIS design requirements for the “accelerator” functionality and code that is not agency owned.
Configuration without an “accelerator”: An agency implements a CCWIS via configuration of a platform (and possible building of custom code to support the configuration) without using an “accelerator” as the starting point. In this scenario, ACF will review the configuration and the custom code built to support the configuration for compliance with CCWIS design requirements.
I.C.8 Evaluating Conformance with CCWIS Design Requirements: Pilot Methodology and Sample Scoring Sheet
ACF is piloting an evaluation methodology that uses the goals identified in Section B as “conformance indicators” to assess the quality of an agency’s modular design approach for CCWIS implementation and their automated functions. This pilot evaluation methodology is presented below, followed by a sample scoring sheet for illustration. The sample scoring sheet walks through a “final rating” calculation that represents conformance or non-conformance of an automated function with CCWIS design requirements. The scoring sheet is adapted from the Department of Defense’s (DoD's) Modular Open Systems Approach (MOSA) Program Assessment and Rating Tool (PART), which serves as the basis of the evaluation method.
This method does not replace a TA functionality review of whether the automated function meets the program and policy needs of the agency. Rather, a conformance-to-design-requirements evaluation may augment the CCWIS TA review process that determines whether a CCWIS automated function does what the agency needs it to do. The automated function may be reviewed for its adherence to CCWIS design requirements either independent of, or in coordination with, the CCWIS TA functionality review.
Pilot Weighted Methodology for Evaluating Conformance to Design Requirements
Under this pilot methodology, during a TA review, ACF will assess each conformance indicator and assign it a score based on its conformance to design requirements. The conformance to design requirements score will fall on a spectrum, ranging from non-compliance to exemplary implementation. A score that indicates an unacceptable level of conformance will require the project to take corrective measures to achieve conformance. During a CAR, if ACF finds an agency has an unacceptable level of conformance, ACF may designate the agency’s system a non-CCWIS per federal regulations at 45 CFR 1355.55.
During a TA design review, ACF will assess each conformance indicator in “Category 1” through “Category 4.” Categories 1 – 4 correspond to the groups of conformance indicators in Sections I.B1– I.B4 of this document, respectively.
Both conformance indicators and Categories are assigned pre-defined weights by ACF to represent their relative level of compliance priority. Some factors assessed during a TA review, such as conformance to modularity in design at 1355.53(a)(1), may be problematic and considered hard failures. These will cause ACF to determine that the module does not comply with the CCWIS design requirements. Other factors, such as lacking plain language documentation as defined at 1355.53(a)(2), are not considered hard failures, but will affect the scoring of the automated function’s level of conformance to the CCWIS design requirements.
ACF will assign a score of 0 – 3 to each indicator in each Category based on its level of conformance to CCWIS design requirements:
None (0)
Little Extent (1)
Moderate (2)
Large Extent (3)
Aggregated conformance indicator scores will be calculated for each Category. The Category Scores are used to calculate a Final Rating that represents the automated function’s overall level of conformance with CCWIS design requirements. The calculated Final Rating is then mapped to a conformance level using the following scale:
Final Rating Scale:
Unsatisfactory (< 50%)
Needs Work (51%-71%)
Satisfactory (72%-80%)
Exemplary (> 80%)
A Final Rating below 72% indicates an unacceptable level of conformance that may necessitate that the agency take corrective measures to achieve conformance with CCWIS requirements.
Calculation
Pre-defined weights assigned to each conformance indicator:
Not applicable/not available (0), which does not affect the final conformance rating calculation
Low (1)
Medium (2)
High (3)
Pre-defined weights assigned to each Category:
Category 1: 1355.53(a)(1) Modular Design Requirements – 30%
Category 2: 1355.53(a)(2) Plain Language Requirements – 15%
Category 3: 1355.53(a)(3) Design and Development Standards Requirements – 25%
Category 4: 1355.53(a)(4) Share, Leverage, Reuse Requirements – 30%
Step 1 – Calculate Category 1 Score – Multiply each conformance indicator’s score (assessment rating) in this category by its assigned weight to calculate the weighted assessment score for each indicator. Sum the weighted assessment scores for the Total Assessment Score. Multiply the sum of the assigned weights by 3 (the highest assessment rating possible) for the Maximum Possible Score. Divide the Total Assessment Score by the Maximum Possible Score for the total Category Score.
Category 1 Score = Total Assessment Score/Maximum Possible Score
Step 2 – Calculate Category 2 Score - Repeat Step 1 for Category 2.
Step 3 – Calculate Category 3 Score - Repeat Step 1 for Category 3.
Step 4 – Calculate Category 4 Score - Repeat Step 1 for Category 4.
Step 5 – Calculate the Weighted Category Scores – Multiply each Category Score from Steps 1 – 4 by its ACF-assigned weight for its Weighted Category Score:
Weighted Category 1 Score = Category 1 Score X .30
Weighted Category 2 Score = Category 2 Score X .15
Weighted Category 3 Score = Category 3 Score X .25
Weighted Category 4 Score = Category 4 Score X .30
Step 6 – Calculate the Final Rating – The Overall Weighted Score is the sum of all four Weighted Category Scores. It represents the automated function’s percent level of conformance to design requirements. The Final Rating is measured against the Final Rating Scale to determine if the automated function complies with CCWIS design requirements.
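The following is a minimal sketch, in Python, of the Step 1 through Step 6 arithmetic, pre-loaded with the illustrative weights and assessments from the sample scoring sheet below; it is an aid to understanding the calculation, not an official scoring tool.

CATEGORY_WEIGHTS = {1: 0.30, 2: 0.15, 3: 0.25, 4: 0.30}


def category_score(indicators):
    """indicators: (ACF-assigned weight, assessment 0-3) pairs.
    A weight of 0 marks an N/A indicator; it adds nothing to either sum."""
    total = sum(weight * assessment for weight, assessment in indicators)
    maximum = sum(weight * 3 for weight, _ in indicators)
    return total / maximum


def final_rating(category_scores):
    """Overall Weighted Score: sum of each Category Score times its weight."""
    return sum(score * CATEGORY_WEIGHTS[cat] for cat, score in category_scores.items())


# Illustrative (weight, assessment) pairs from Tables C-1 through C-4:
scores = {
    1: category_score([(3, 1), (1, 2), (3, 2), (2, 1), (2, 1)]),  # 15/33 = 0.455
    2: category_score([(2, 3), (2, 3), (2, 2), (2, 3), (2, 3)]),  # 28/30 = 0.933
    3: category_score([(3, 2), (0, 0), (3, 3), (2, 3), (3, 2)]),  # 27/33 = 0.818
    4: category_score([(3, 3), (2, 2), (1, 1), (3, 1), (3, 2)]),  # 23/36 = 0.639
}

print(f"Final Rating: {final_rating(scores):.1%}")
# prints about 67.3%; the sample sheet's .674 reflects rounding of intermediate values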
Sample Scoring Sheet
This sample scoring sheet may be used to record assessment scores and calculate a final rating of conformance. The example below illustrates the calculation. Only a sample of conformance indicators from each Category in Sections I.B1 – I.B4 of this document has been used for this calculation.
In this scenario, during a CCWIS design review, a reviewer has assigned scores to each conformance indicator. These scores appear in the “Assessment” columns in Tables C-1 through C-4. Tables C-1 through C-4 represent each Category, respectively.
Finally, Table C-5 tabulates these scores for the calculation of a Final Rating from Step 6 above.
(Note: Conformance indicators used in this example may not reflect the most up-to-date indicators discussed in this draft self-assessment tool. For a complete list of conformance indicators that will be used in TA activities and reviews, refer to Section B of this document.)
Table C-1: CCWIS Design Requirement Goals (or Conformance Indicators) for Modular Design
Category 1
|
Conformance Indicators for 1355.53(a)(1) |
ACF-Assigned Weight |
Assessment |
Assessment Score (Weight x Assessment) |
Maximum Possible Score (Weight x 3) |
Assessment Guidelines for Reviewer |
Coupling |
The automated function has been designed with clear boundaries. |
3 |
1 |
3 |
9 |
0 = modules are not distinctly separated, lack clear responsibilities, and require an understanding of other modules. 1,2 = some module boundaries or responsibilities are unclear. 3 = modules are distinctly separated with clear boundaries and responsibilities |
Coupling |
The automated function does not require other automated functions to perform its tasks. |
1 |
2 |
2 |
3 |
0 = modules cannot function independently; dependencies not specified. 1,2 = some dependency on other modules. 3 = autonomous, independent modules with explicit external dependencies |
Coupling |
The automated function efficiently communicates with other automated functions within the CCWIS. |
3 |
2 |
6 |
9 |
0 = unstructured and unmanaged communication interfaces. 1,2 = some inconsistent or inefficient communication paths. 3 = clear and effective interfaces between components |
Cohesion |
The identified automated function reflects a discrete, easily defined purpose. The automated function performs a single action or set of actions to meet an objective. |
2 |
1 |
2 |
6 |
0 = modules lack a well-defined purpose or incorporate unrelated, disparate business functionality. 1,2 = modules may include some unrelated functionality. 3 = modules have a clear purpose and set of functions to support that purpose |
Cohesion |
The automated function’s functionality does not significantly overlap with any other automated function within the CCWIS. |
2 |
1 |
2 |
6 |
0 = duplicative functionality between modules. 1,2 = some overlapping functionality between modules. 3 = modules have clearly separated sets of functionality |
Total Assessment Score/ Maximum Possible Score |
|
|
|
15 |
33 |
|
Step 1: Category 1 Score = Total Assessment Score/Maximum Possible Score = 15/33 = 0.455
Table C-2: CCWIS Design Requirement Goals (or Conformance Indicators) for Plain Language
Category 2
|
Conformance Indicators for 1355.53(a)(2) |
ACF-Assigned Weight |
Assessment |
Assessment Score (Weight x Assessment) |
Maximum Possible Score (Weight x 3) |
Assessment Guidelines for Reviewer |
Know your audience |
The topic is written with familiarity with the audience, defining why they need this document. |
2 |
3 |
6 |
6 |
0 = audience not well understood or topic not written for all relevant audiences. 1,2 = some audiences not addressed. 3 = audiences appropriately identified and effectively addressed |
Organize your thoughts |
The document is organized to provide clear and concise points. |
2 |
3 |
6 |
6 |
0 = document not clearly organized. 1,2 = some parts of document need additional organization. 3 = document well organized; thoughts clearly and concisely communicated |
Summarize main points |
The document uses headers and lists to summarize main points. |
2 |
2 |
4 |
6 |
0 = ineffective use of headers and lists. 1,2 = some headers and lists may need restructuring. 3 = headers and lists effectively summarize and communicate points |
Write short sentences and paragraphs |
The document comprises concise sentences. The document provides an initial context for the ideas discussed, and incorporates definitions into the text. The paragraphs are simple with one topic sentence and one idea developed throughout the paragraph. |
2 |
3 |
6 |
6 |
0 = document sentences and paragraphs poorly constructed. 1,2 = some sentences and paragraphs may need editing for effective communication of ideas. 3 = document sentences and paragraphs constructed effectively for clear and concise communication of ideas |
Use everyday phrases and words |
The document speaks to the audience (at all levels of expertise) and does not use extraneous words in the document construction. |
2 |
3 |
6 |
6 |
0 = document is overly verbose and uses uncommon language. 1,2 = document may need editing to eliminate some verbosity. 3 = document is concise and uses common words and phrases |
TOTAL |
|
|
|
28 |
30 |
|
Step 2: Category 2 Score = Total Assessment Score/Maximum Possible Score = 28/30 = 0.933
Table C-3: CCWIS Design Requirement Goals (or Conformance Indicators) for Design and Development Standards
Category 3
|
Conformance Indicators for 1355.53(a)(3) |
ACF-Assigned Weight |
Assessment |
Assessment Score (Weight x Assessment) |
Maximum Possible Score (Weight x 3) |
Assessment Guidelines for Reviewer |
Adherence to Standards |
The agency developed and conducted a process for evaluating adherence to design and development standards. |
3 |
2 |
6 |
9 |
0 = no developed process for evaluating adherence to standards. 1,2 = partially established and/or conducted evaluation process. 3 = evaluated adherence to standards |
Adherence to Standards |
The agency acquired QM or IV&V services to monitor the project during development. |
0 (not included in calculation) |
N/A |
N/A |
N/A |
0 = no QM or IV&V services acquired. 3 = QM or IV&V services acquired |
Adherence to Standards |
The agency adheres to its design and development standards. |
3 |
3 |
9 |
9 |
0 = no evaluation, or little to no adherence to standards. 1,2 = inconsistent adherence to standards. 3 = consistent adherence to standards |
Adherence to Standards |
The agency trains staff on standards used and where they can be found. |
2 |
3 |
6 |
6 |
0 = no evidence of training. 1,2 = inconsistent or incomplete training. 3 = consistent and effective training |
Adherence to Standards |
The agency performs code reviews to determine the quality of the code produced. |
3 |
2 |
6 |
9 |
0 = no code reviews. 1,2 = irregular or inconsistent code review process. 3 = consistent and organized code review process |
TOTAL |
|
|
|
27 |
33 |
|
Step 3: Category 3 Score = Total Assessment Score/Maximum Possible Score = 27/33 = 0.818.
Table C-4: CCWIS Design Requirement Goals (or Conformance Indicators) for Sharing, Leveraging and Reusing Automated Functions
Category 4
|
Conformance Indicators for 1355.53(a)(4) |
ACF-Assigned Weight |
Assessment |
Assessment Score (Weight x Assessment) |
Maximum Possible Score (Weight x 3) |
Assessment Guidelines for Reviewer |
Share: Included metadata |
Automated function is easily identifiable via a unique name, which does not conflict with an existing project and does not infringe on trademarks. |
3 |
3 |
9 |
9 |
0 = automated function not clearly and uniquely identified. 1,2 = name of automated function may conflict with or be confused with that of another project. 3 = automated function clearly and uniquely identified |
Leverage: Clear requirements documentation |
Automated function comes with comprehensive requirements documentation. |
2 |
2 |
4 |
6 |
0 = no requirements information or similar documentation. 1,2 = some requirements information provided. 3 = comprehensive requirements documentation included |
Leverage: Security and compliance |
Automated function comes with reports describing the results of performed vulnerability testing. |
1 |
1 |
1 |
3 |
0 = no evidence of vulnerability testing. 1,2 = limited vulnerability-testing information available. 3 = extensive vulnerability test information available |
Reuse: Frameworks |
Automated function is architected to leverage established software frameworks. |
3 |
1 |
3 |
9 |
0 = no use of established software frameworks. 1,2 = some use of frameworks. 3 = effective and appropriate use of established frameworks |
Reuse: Design patterns |
Automated function leverages underlying design patterns. |
3 |
2 |
6 |
9 |
0 = no identifiable design patterns, or use of anti-patterns. 1,2 = ineffective use of design patterns. 3 = effective and appropriate use of established design patterns |
TOTAL |
|
|
|
23 |
36 |
|
Step 4: Category 4 Score = Total Assessment Score/Maximum Possible Score = 23/36 = 0.639
Table C-5: Final Rating
Category |
Category Score (Column 1) |
ACF-Assigned Category Weight (Column 2) |
Column 3 (Column 1 x Column 2) |
1 |
.455 |
.30 |
.137 |
2 |
.933 |
.15 |
.140 |
3 |
.818 |
.25 |
.205 |
4 |
.639 |
.30 |
.192 |
Overall Weighted Score: |
|
|
.674 |
Step 5: Weighted Category Scores - See Table C-5 Column 3.
Step 6: Final Rating = Overall Weighted Score = .674, or approximately 67%.
Final Rating Scale:
Unsatisfactory (< 50%)
Needs Work (51%-71%)
Satisfactory (72%-80%)
Exemplary (> 80%)
Based on the Final Rating Scale, the automated function’s compliance with CCWIS design requirements is rated “Needs Work.”
I.C.9 Resources
National Institute of Standards and Technology
ISO/IEC 26514:2008 Systems and software engineering — Requirements for designers and developers of user documentation
https://www.iso.org/standard/43073.html
26514-2010 - IEEE Standard for Adoption of ISO/IEC 26514:2008 Systems and Software Engineering--Requirements for Designers and Developers of User Documentation
https://ieeexplore.ieee.org/document/5712775
Guidance on the Requirements for Documented Information of ISO 9001:2015
https://www.iso.org/files/live/sites/isoorg/files/archive/pdf/en/documented_information.pdf
Plain Language Guidelines
“Why Use Plain Language: 10 Steps to Plain Writing,” U.S. Census Bureau, https://www.census.gov/content/dam/Census/about/about-the-bureau/policies_and_notices/10_simple_steps.pdf
Department of Defense’s (DoD's) Modular Open Systems Approach (MOSA) Program Assessment and Rating Tool (PART)
https://www.dau.edu/cop/mosa/Lists/Tools/DispForm.aspx?ID=2
“Modularity: A better approach to Enterprise IT system modernization in the public sector,” Pradeep Goel, CEO, EngagePoint; “Demystifying Modularity – What does a modular MMIS look like?” August 2016, CNSI
1 See CCWIS Technical Bulletin #1: Identifying and Reporting CCWIS Automated Functions. https://www.acf.hhs.gov/sites/default/files/cb/ccwis_tb1.pdf
2 ACF endorses no standard. Standards or standards bodies mentioned in this document are for example only. ACF lists potentially useful standards bodies and other sources on its website at https://www.acf.hhs.gov/cb/research-data-technology/state-tribal-info-systems/resources
3 See Child Welfare Policy Manual Section 6.12A Question #1
PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN: Through this information collection, the Administration for Children and Families (ACF) is collecting information to document that title IV-E agencies have planned and developed their system’s conformity to federal CCWIS and Advance Planning Document requirements. Public reporting burden for this collection of information is estimated to average 24 hours per title IV-E agency choosing to develop and implement a CCWIS system, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0568 and the expiration date is 04/30/2024.