U.S. Department of Education
Investment Acquisition Management Team (IAMT)
Information Technology
Operational Analysis Guide
TABLE OF CONTENTS
1 Introduction & Purpose
1.1 Operational Analysis (OA) Overview
1.2 Value to the Investment and Mission
1.3 Office of Management and Budget Guidance
1.4 Responsibilities
1.5 Frequency
1.6 Process Flow
2 OA Planning
3 Documentation and Reporting
3.1 Format
3.2 Evaluation
4 Methodology
4.1 Post-Implementation Review (PIR)
4.2 Analysis of Alternatives
4.3 Customer/Stakeholder Satisfaction
4.3.1 Data Sources
4.4 Strategic & Business Results
4.4.1 OMB e-Government/Line of Business Initiatives
4.4.2 Data Sources
4.5 Financial Performance
4.5.1 Data Sources
4.5.2 Current Cost Baseline
4.6 Innovation
4.6.1 Data Sources
4.7 Risk Assessment
4.8 Technical Performance Analysis
5 Results
5.1 Performance Variances
5.1.1 Mission/Performance/Satisfaction Gap Analysis
5.2 Holistic Enterprise View
6 Recommendations
7 Plan of Action and Milestones (POAM)
Appendix A: Roles & Responsibilities
Appendix B: OA Templates
Appendix C: Glossary of Acronyms and Abbreviations
Appendix D: Glossary of Terms
LIST OF KEY TABLES AND FIGURES
Table 1: Data Sources
Table 2: Financial Performance Categories
Figure 1: OA Place in ITIM Schema
Figure 2: OA Process Flow
LIST OF ATTACHMENTS
Attachment 1 ‐ Operational Analysis Report Template
Attachment 2 ‐ Customer Survey Checklist Template
DOCUMENT VERSION HISTORY
Version | Date | Updated By | Changes
Version 0.1 | February 2010 | eGlobalTech | 1st Review Draft
Version 0.2 | October 2010 | eGlobalTech | 2nd Review Draft
Version 0.3 | December 2010 | eGlobalTech | 3rd Review Draft. Amended to make the PM's OA Plan the primary source for the OA Report. Added "Increasing O&M Costs" to examples of causes for gaps in Table 3.
Version 0.3 | February 2011 | eGlobalTech | 4th Review Draft. Amended to remove the PM's OA Plan and move necessary information into the OA Report.
Version 0.4 | April 5, 2011 | eGlobalTech | Incorporated comments from COR.
1 INTRODUCTION & PURPOSE
This Operational Analysis (OA) Guide provides Segment Owners1 and project managers (PMs) of
major information technology investments2 with a functional methodology for performing an
Operational Analysis. It also provides a description of what information and data elements
should be captured and assessed before, during, and after the analysis. This guide is based on
Federal guidance, industry best practices, and Department of Education information resource
management (IRM) protocols.
While the majority of investment control activities center on meeting the project’s cost and
schedule goals during development, the development stage represents only a fraction of the
project’s total life‐cycle duration and costs. Ownership costs incurred during the remainder of
the investment’s useful life can easily consume as much as 80 percent of total life‐cycle costs.3
The purpose of the OA is to monitor and evaluate investments in the mixed or steady‐state
phase to ensure that they are continuing to meet cost, risk and value expectations. The OA and
its reporting structure are designed to definitively assess and articulate whether an investment
is meeting its prescribed objectives, be they operational, organizational, business or technical.
The ultimate goal is efficiency and effectiveness: making the right investment in the right way.
This guide is designed to provide users with a clear reference for planning, conducting, and
using an operational analysis.
• Section 1 – Introduction and Purpose: describes the OA, its inherent value, relevant OMB guidance, and subsequent responsibilities.
• Section 2 – OA Planning: discusses the procedural elements that should be completed or in progress prior to conducting the analysis.
• Section 3 – Documentation and Reporting: gives a description of source documents (inputs & outputs and format & structure) required for conducting the OA, and proposed resulting documents which should exist upon the conclusion of the analysis.
• Section 4 – Methodology: describes the requisite OA data elements and their sources, how to capture these elements, how these elements should be used to make a definitive assessment, and a structure for creating a repeatable process.
1. The term Segment Owner is used at the Department of Education to describe the senior stakeholder and advocate for a particular line of business (i.e., "segment") of the Department. The Department's enterprise architecture presently consists of 13 segments (for example, grants, loans, research, etc.), each with an "owner" designated by the Department's Planning and Investment Review Working Group to articulate the needs, goals, objectives, and plans of a particular segment. For more on Segment Owners, consult page 12 of the Department of Education IT Investment Management Process Guide, December 2009, on ConnectEd.
2. A "major information technology investment" is defined on page 18 of the above-mentioned IT Investment Management Process Guide.
3. OMB Capital Programming Guide Version 2.0, Supplement to OMB Circular A-11, Part 3: Planning, Budgeting, and Acquisition of Capital Assets, June 2006. See p. 53: http://www.whitehouse.gov/
• Section 5 – Results: outlines how the OA is used in the discovery of investment-specific performance gaps and as a mechanism for corrective action.
• Section 6 – Recommendations: provides information on how the OA team4 should structure its findings to develop solutions for minimizing or eliminating gaps.
• Section 7 – Plan of Action and Milestones (POAM): provides information for the OA team and its analyst to work with the Segment Owner and/or PM to develop a plan moving forward that will lead to compliance, program fulfillment, and customer satisfaction.
There are two templates that should be used in performing and evaluating an OA.
• Attachment 1: Operational Analysis Report Template
• Attachment 2: Customer Survey Checklist
Each OA should result in an Operational Analysis Report.
Figure 1 below illustrates the OA’s place in the ITIM schema.
Figure 1: OA Place in ITIM Schema
4. See Appendix A for further details.
1.1 OPERATIONAL ANALYSIS (OA) OVERVIEW
An OA is a method of examining the current performance of a steady‐state investment and
measuring that performance against an established set of cost, risk and performance
parameters. An OA should trigger considerations of how objectives could be better met, how
costs could be saved, and whether in fact certain functions should continue to be performed.
An OA should demonstrate that you have thoroughly examined the need for the investment,
the investment's performance, and alternative methods of achieving the same results.
OA findings will locate gaps in performance and provide insight into their causes. The analysis helps the agency compare one investment against another and identify opportunities to improve outcomes or realize efficiencies that better support ED's mission. The findings will reveal whether the investment continues to achieve agency goals or should be replaced.
The OA is a key practice within the Government Accountability Office's (GAO) Information
Technology Investment Management (ITIM) Stage 2 maturity model5, a model adopted and
followed by ED.
1.2 VALUE TO THE INVESTMENT AND MISSION
The OA is designed to measure the effectiveness and efficiency of investments in the steady-state (operations and maintenance, or O&M) life-cycle phase. The analysis will either re-validate the cost and performance indicators of the investment, or show a need to find better ways for the asset to meet its life-cycle cost and performance goals. Operational performance for a given asset may be indicated by factors such as:
• effectiveness
• reliability
• productivity
• maintainability
• availability
• security
• energy efficiency (Note that the December 2009 OMB passback requires agencies to plan to reduce IT energy consumption by a minimum of 30% by 2012.)
5. See GAO ITIM Guidance at: http://www.gao.gov/
1.3 OFFICE OF MANAGEMENT AND BUDGET GUIDANCE
An OA is the Office of Management and Budget's (OMB) preferred method of measuring the performance of investments in the steady-state (O&M) life-cycle phase. This Control Phase6 process differs from other methods in that it takes into account the stability of cost, schedule, performance, and risk of operational investments. In accordance with Circular A-11, OMB has determined that federal operational analyses should focus on four core measurement areas:
• Customer satisfaction
• Strategic and business results
• Financial performance
• Innovation
“For capital investments, the greatest level of operational efficiency occurs at the
asset or project level. To improve the accuracy and efficiency of operational data
collection, whenever possible, an agency should employ an efficient way of
collecting and analyzing operating cost and performance data." (OMB Circular A-123)
OMB requires agencies to perform an OA annually on the steady‐state components of each
major IT investment. The results of the most recent OA are used to inform the annual IT
portfolio Select process9, and influence the content of the investment’s business case and
subsequent reporting to OMB in the Exhibit 300.
OMB addresses the value and use of OA in Part 3 of the Capital Programming Guide.11 OMB
advises agencies to establish a system to measure the performance and cost of an operational
asset against the baseline established in the planning phase. This information will allow agency
resource managers to optimize the performance of capital assets. Additionally, an OA may
indicate the need for a new capital asset.
1.4 RESPONSIBILITIES
The PM is responsible for ensuring that the OA is conducted. Upon completion, the PM is to place a copy of the OA Report in the business case to which it pertains. The PM may consult OCIO's Investment and Acquisition Management Team (IAMT)12 concerning the composition of an OA team that ensures an objective analysis. The PM will work with OCIO's Enterprise Architecture Program Office (EAPO) to ensure that the investment is assigned the proper Enterprise Architecture (EA) segment and meets the target architecture (the EAPO can be reached at EAHelp@ed.gov). OCIO's IAMT and EAPO are responsible for reviewing the OA for completeness and for ensuring that it reasonably satisfies its purpose and the requirements of OMB and ED.
6. The Control Phase of the IT investment management process is explained on pp. 26-32 of the Department of Education IT Investment Management Process Guide, December 2009.
7. OMB Circular A-11, Section 300, page 20. See: http://www.whitehouse.gov/
8. OMB Circular A-123. See: http://www.whitehouse.gov/
9. The Select process is explained on pp. 19-25 of the IT Investment Management Process Guide on ConnectEd.
10. Learn about the Exhibit 300 on page 41 of the IT Investment Management Process Guide on ConnectEd.
11. Capital Programming Guide Version 2.0, Supplement to OMB Circular A-11, Part 3: Planning, Budgeting, and Acquisition of Capital Assets, June 2006. See: http://www.whitehouse.gov/
12. OCIO's Investment and Acquisition Management Team is referred to hereinafter as the "ITIM team."
The Segment Owner is responsible for reviewing the OA and initiating any actions to improve
the investment’s performance, including recommending to the Planning and Investment
Review Working Group (PIRWG) changes in scope or discontinuance of the investment.
Using verifiable data contained in the OA, the PIRWG will review IT investment performance
against stated expectations and forward its findings to the CIO for presentation to and ultimate
decisions by the agency Investment Review Board (IRB). The PIRWG will use the investment’s
OA as the basis for this review to help ensure ED conforms to GAO’s ITIM Stage 2 maturity
model.
1.5 FREQUENCY
An OA should be completed and submitted to OCIO/IAMT within a year following the first full
fiscal year that the investment has a steady‐state component. An updated OA should be
completed annually thereafter. A submission earlier in the year allows the findings of the OA to
be considered during the Select Phase, when investments are selected for funding.
1.6 PROCESS FLOW
The following flow chart displays the overall process for initiating, preparing, submitting and acting on the findings of
the OA. Not included are the processes at the organizational level between the PM and the Segment Owner. Those
interactions are unique to each investment and should be described in the investment’s OA Plan in sufficient detail that
a new PM would have no difficulty following them.
Figure 2: OA Process Flow
2 OA PLANNING
The PM of each steady-state investment, or mixed investment with operational components, must ensure the completion of an annual OA that produces consistent, objective results derived from reliable and repeatable procedures. This will provide the members of the OA Team, who will vary over the life of the investment, with concise directions in order to avoid incomplete or inconsistent reviews from year to year. The PM will specify roles and responsibilities, and identify the following elements at a minimum:
• The investment's management control process
• Who will perform the OA
• How the OA will be funded
• Who the customers and stakeholders are
• How data will be gathered
3 DOCUMENTATION AND REPORTING
The source data, analytical processes, and findings of the OA should be documented and
retained in the investment library. Documentation should include data source documents
and data collection methods. Data source documentation may include:
• survey results,
• system logs,
• progress reports,
• work breakdown structure (WBS).
Any actions taken in response to the findings and recommendations of the OA should be documented for consideration in future OAs or other analyses. This documentation provides valuable information for future analyses and traceability for the actions' rationale; it is useful when responding to future audits and helps to demonstrate management attention to weaknesses or opportunities.
3.1 FORMAT
The report of the OA results is to be prepared and submitted using the Report template found
in Attachment 1 to this guide.
3.2 EVALUATION
The results of the OA Report will be evaluated by the ITIM team (i.e. OCIO/IAMT, the PIRWG,
etc.).
4 METHODOLOGY
This section of the guide describes the materials you will use and the ways you will use the data to assess the
investment's management and operational success. The OA will leverage the work accomplished prior to the
IT asset becoming operational. It requires familiarity with alternatives considered prior to implementation,
performance metrics, customer survey information and feedback, post‐implementation review results, and
budget and cost data.
The objective of an OA is to provide a fresh look at the investment in light of changing events. The OA lead
should be someone who can approach the analysis with a degree of impartiality13. A team approach is
encouraged, as it provides the opportunity to include the perspective of the investment customers,
stakeholders and operational team.
Findings and recommendations are to be reported in a standard format using the OA Report template14.
Sources of data in the OA must be reliable and verifiable. The data source should be identified in the plan and
in the final report. Any original data collected should be retained in the investment’s project library for future
reference and analysis.
(Note: Objective quantitative measures are preferred for effectiveness/efficiency, whereas subjective measures
are typically used to assess user and/or customer satisfaction and to elicit potential improvements).
Table 1: Data Sources
Objective Data Sources: Efficiency/Effectiveness | Subjective Data Sources: Customer Satisfaction
Projected Cost Data | Surveys
Actual Expense Data | Customer Focus Groups
Schedule Milestones | User Group Meetings
Technical Performance | Complaints/Suggestions
13. One approach might be to obtain a separate contractor, such as one that conducts independent validation and verification (IV&V) activities, to perform the operational analysis.
14. Please see Attachment 1 for the Operational Analysis Report template.
4.1 POST‐IMPLEMENTATION REVIEW (PIR)
The OA should build upon the findings and outcomes of the PIR (if available) that was conducted upon
completion of the development/modernization/enhancement (DME) phase of the investment. The analysis
should examine any findings of the PIR that may be pertinent to the steady‐state phase, or lead to
recommendations for adjusting the focus or management of the investment.
The PIR is a process for tracking and measuring the impact and outcomes of implemented or canceled IT
initiatives. A PIR is the starting point and an input for the Evaluate Phase15 of the CPIC process. The PIR
focuses on verification and validation of six primary areas. These are the same components that the OA
assesses in an on‐going manner throughout the steady‐state phase:
• impact on goals and strategic objectives
• impact on stakeholders
• cost and schedule variances
• operational performance
• architectural compliance
• project risk management.
4.2 ANALYSIS OF ALTERNATIVES
The OA should always include a review of, and make reference to, the current investment analysis of
alternatives. Reviewing the analysis of alternatives will provide insight into options or limitations that may
help to frame the development of the OA. If the findings of the OA appear to be at odds with the analysis of
alternatives, or suggest an alternative not previously considered, the reasons should be clearly stated in the
OA Report.
4.3 CUSTOMER/STAKEHOLDER SATISFACTION
PMs should periodically assess whether the investment continues to support customer processes as designed.
The focus of the customer/stakeholder satisfaction assessment should be to determine how well the
investment meets customer needs and delivers services, and whether it could be improved to better meet
changing requirements. Techniques for measuring customer satisfaction include interviews, investment-specific questionnaires, user groups, on-line feedback, and review of help desk logs. Issues to be addressed include the degree to which functionality and performance are satisfactory to the customer, whether the investment is helping users to perform their functions more easily or efficiently, and whether it will continue to meet customer and stakeholder needs. The method of measuring satisfaction should include a breakdown of results and analysis by business need.
15. Capital Programming Guide Version 2.0, Supplement to OMB Circular A-11, Part 3: Planning, Budgeting, and Acquisition of Capital Assets, June 2006. See: http://www.whitehouse.gov/
4.3.1 DATA SOURCES
The PM must establish a strategy to solicit user or customer input. This can be done by a survey, focus groups
or regular user group meetings. The PM must document the schedule and strategy for collecting this data.
PMs should use the OA Report template16 to assist in gathering data.
Measured against projected investment benefits, the survey, focus group, or user group results will show whether the investment is meeting its original or revised objectives. The results are to be documented in the OA Report.
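As one illustration of the breakdown by business need described above, the sketch below aggregates hypothetical survey scores and flags business needs that fall short of a satisfaction target. The response data, the 1-5 scale, and the 80 percent target are assumptions made for illustration only; they are not prescribed by this guide.

```python
from collections import defaultdict

# Hypothetical survey responses: (business need, satisfaction score on a 1-5 scale).
responses = [
    ("Process grant applications", 5), ("Process grant applications", 4),
    ("Generate compliance reports", 3), ("Generate compliance reports", 2),
    ("Track loan servicing", 4), ("Track loan servicing", 4),
]

TARGET = 0.80  # assumed target: share of responses scoring 4 or 5

scores_by_need = defaultdict(list)
for need, score in responses:
    scores_by_need[need].append(score)

for need, scores in sorted(scores_by_need.items()):
    satisfied = sum(1 for s in scores if s >= 4) / len(scores)
    status = "meets target" if satisfied >= TARGET else "gap - investigate"
    print(f"{need}: {satisfied:.0%} satisfied ({status})")
```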
4.4 STRATEGIC & BUSINESS RESULTS
Strategic and business results measure how well the investment is meeting ED’s business needs, whether it is
contributing to the achievement of ED’s current strategic goals17, and its alignment with the enterprise
architecture. An investment's mapping to ED's strategic goals is captured in its business case. In this category
of analysis, the OA should provide data that contributes to answering such questions as: What strategic goals
does this investment align with and support, and how does it help us achieve them? Over time, as business
practices evolve and as missions and programs wax or wane, methods, procedures and processes once
considered effective may no longer produce the best return on the dollar.
Performance measurement18 and a summary of quantified benefits are important elements of the OA framework. The project needs to be able to identify, measure, and track the accumulation over time of those
benefits that were cited as justification for funding the investment. Benefit accumulation schedules may be
based on the original quantitative benefit projection in the analysis of alternatives or may be revised in
subsequent projections based on the program’s actual cost, schedule, and technical performance data.
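As a minimal sketch of such a benefit accumulation schedule (all figures are invented for illustration), the example below compares cumulative realized benefits against the cumulative projection carried forward from the analysis of alternatives.

```python
# Hypothetical annual benefits, in thousands of dollars.
projected = [250, 300, 350]   # projection from the analysis of alternatives
realized = [230, 280, 310]    # benefits measured during steady state

cumulative_projected = 0
cumulative_realized = 0
for year, (p, r) in enumerate(zip(projected, realized), start=1):
    cumulative_projected += p
    cumulative_realized += r
    shortfall = cumulative_projected - cumulative_realized
    print(f"Year {year}: projected {cumulative_projected}K, "
          f"realized {cumulative_realized}K, shortfall {shortfall}K")
```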
The OA lead should review the most recent Information Resources Management (IRM) Strategic Plan19 to
ensure that the investment’s technology and IRM are supportive of, and do not conflict with, the functional
areas of the IRM goals.
ED has developed an enterprise architecture that describes the agency’s lines of business and the subsequent
technology and process components that support its business functions. Target enterprise architectures
describe ED’s vision for operations in the future, including how all ED systems support specific processes or
lines of business. The program or investment should be reviewed against the target enterprise architecture to
re‐validate the need for the investment or to anticipate changes that might be required of the investment due
to interdependencies with other investments. Before changing the existing investment, consider using
another investment within ED to meet future needs. The EAPO (EAHelp@ed.gov) can provide assistance.
4.4.1 OMB E‐GOVERNMENT/LINE OF BUSINESS INITIATIVES
Discuss any planned alignment with or migration to an e-Government or line-of-business solution (if applicable), including the transition strategies to accomplish the alignment.
16. Please see Attachment 1 for the Operational Analysis Report template.
17. Department of Education Strategic Plan; http://www.ed.gov/
18. For guidance in selecting appropriate performance measures, refer to www.whitehouse.gov/.
19. Department of Education IRM Strategic Plan; http://connected/
4.4.2 DATA SOURCES
Most performance measures you will use are contained in the investment’s business case. However,
additional customer satisfaction20 and technical performance data will also be important.
4.5 FINANCIAL PERFORMANCE
The financial performance assessment compares current financial performance to the pre-established (initial) cost baseline, and reviews the system for cost reasonableness and efficiency. Financial performance should be expressed in terms of planned expenditures, actual expenditures, and ratios comparing planned to actual expenditures. The financial performance analysis is the periodic analysis of current cost performance and the expected changes to annual estimates going forward. An investment with steadily increasing costs during steady state needs to be balanced against the potential return on that investment.
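A minimal sketch of that comparison is shown below, using invented figures; it computes the planned-to-actual ratio and the percentage cost variance that feed the variance discussion in Section 5.1.

```python
planned_expenditures = 1_200_000  # hypothetical annual O&M budget, in dollars
actual_expenditures = 1_310_000   # hypothetical actual spend for the same period

ratio = planned_expenditures / actual_expenditures
variance_pct = (actual_expenditures - planned_expenditures) / planned_expenditures * 100

print(f"Planned-to-actual ratio: {ratio:.2f}")   # 0.92
print(f"Cost variance: {variance_pct:+.1f}%")    # +9.2%
```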
Describe the method you are using to measure and track cost, schedule, and performance metrics. Describe
the investment’s cost, schedule, and performance baseline, and describe the management technique you are
using to monitor metrics against the baseline (monthly status review meetings, budget reviews, etc). Also
describe the quantitative metrics you are using to measure variances from the baseline, and the frequency
with which you apply these measurements. It could also be helpful in this section to describe any tools you
are using to track performance metrics (Microsoft Project, Excel spreadsheets, etc.).
Discuss the current performance of the investment.
• Is performance within limits of variance? If not, what corrective actions are you taking to get back on track?
• Has upper management concurred in the planned corrective actions?
20. Measurement of customer satisfaction can be attained via surveys (for example, see Attachment 2 for further detail on how to develop a customer survey).
The categories in the table below should be used to describe financial performance.
Table 2: Financial Performance Categories
# | Category | Description
1 | Cost and Schedule Performance Measurement | Describe the method you are using to measure and track cost, schedule, and performance metrics.
2 | Cost and Schedule Performance Control Techniques | Describe the management technique you are using to keep cost and schedule on the performance baseline.
3 | Cost and Schedule Performance Tools | Describe any tools you are using to track performance metrics (Microsoft Project, Excel spreadsheets, etc.).
In addition to the costs of the investment, the financial performance analysis should address the
indicators/metrics used to measure performance in relation to cost saving/avoidance identified in the Exhibit
300. The OA must also address the indicators/metrics used to measure the return‐on‐investment and
payback‐period estimates of the cost/benefit analyses that were part of the investment analysis.
To ensure that the products and services delivered to customers reflect full value for the resources expended,
the investment’s schedule and risk management plan/records, and its financial records, must provide
sufficiently detailed data.
4.5.1 DATA SOURCES
The performance measurement baseline is the source of expected investment costs. Budget data for the
investment should align with IRB‐approved funding for the investment.
4.5.2 CURRENT COST BASELINE
OMB Memorandum 05-23 requires federal agencies to establish and validate performance measurement
baselines with clear cost, schedule, and performance goals for “all new major IT projects, ongoing major IT
developmental projects, and high risk projects to better ensure improved execution and performance as well as
promote more effective oversight.” The PIRWG approves an investment’s cost baseline and subsequent
changes. The OA should be developed using the current cost baseline that has been reviewed and approved
by the PIRWG.
4.6 INNOVATION
Opportunities for improving an investment may be found in new technologies or in new work flows, data flows, or other processes for accomplishing the business objective, or a combination of new technologies and change in the business process. Addressing innovation in the OA is an opportunity to demonstrate that the PM is in touch with the stakeholders' fundamental business needs and is monitoring the current state of the technology and availability in the marketplace of cost-saving and performance-enhancing technologies. The OA is to include a review of the latest Alternatives Analysis and comment on the alternatives considered, in the context of innovative solutions to the needs of the Segment Owner.

21. OMB Memorandum 05-23, "Memorandum for Chief Information Officers," August 4, 2005, http://www.whitehouse.gov/
The Department’s Enterprise Architecture and IRM Strategic Plan22 should be reviewed with thought given to
transitioning from ‘stove pipe’ systems to investments that cross lines of business (LoB) where doing so will
increase effectiveness or efficiency. The process will ensure the PM is communicating with investment
customers and stakeholders to address questions such as:
• How could we deliver more effective service to the customer?
• Could we meet these same customer needs at lower cost?
• Could this investment be combined with others to better meet our organization's strategic goals?
4.6.1 DATA SOURCES
Data that may identify potential benefits of innovation may be found in trade journals, vendor performance
statistics, user surveys/user group meetings, PIR results, market research, business process reengineering
(BPR) studies, oversight reports (such as IG and GAO), or other studies of which the Segment Owner may be
aware.
4.7 RISK ASSESSMENT
Throughout the development and steady‐state phases of an investment, risks are identified and tracked, and
related mitigation activities are monitored. Any significant changes to the risk environment/status or risk‐
mitigation actions need to be analyzed as part of the OA activity to ensure the risks have not become
unacceptable relative to the benefits of the investment at its current baseline. Conversely, a significant
reduction in risk may indicate a benefit in extending the life of the investment.
4.8 TECHNICAL PERFORMANCE ANALYSIS
Regardless of which performance indicators and objectives were reported in the business case and to OMB (Exhibit 300), a number of technical performance indicators/objectives should be monitored as part of an OA effort to ensure that performance levels are sustained or continue to improve over time.
Technical performance indicators/objectives to monitor should include items such as:
• Functional performance (how long it takes to perform a function using the system; e.g., process a claim)
• Frequency and length of unscheduled outages
• Maintenance and equipment outages
• Mean time between outages/failures
• Mean time to restore service
• Corrective maintenance action labor hours
• Operational availability (see the availability sketch following this list)
• Operational productivity measures (e.g., mean time to perform functions)
• Human or system error rates
• Training time to proficiency.

22. Department of Education Enterprise Architecture and IRM Strategic Plan; http://connectedED/
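Several of the indicators above are related. For example, a common approximation estimates operational availability from mean time between failures (MTBF) and mean time to restore service (MTTR); the sketch below, with invented figures, illustrates that calculation. The specific numbers and the use of this formula here are illustrative assumptions, not a reporting requirement of this guide.

```python
# Hypothetical reliability figures for one reporting period.
mtbf_hours = 400.0  # mean time between outages/failures
mttr_hours = 2.5    # mean time to restore service

# Common approximation: availability = uptime / (uptime + downtime).
availability = mtbf_hours / (mtbf_hours + mttr_hours)
print(f"Estimated operational availability: {availability:.2%}")  # about 99.38%
```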
5 RESULTS
The result of the OA process should be twofold:
1. More effective delivery of services by the investment, and
2. Improved effectiveness of the Department’s IT Portfolio in accomplishing ED’s goals.
Findings should be supported by evidence that is well documented and available for review.
5.1 PERFORMANCE VARIANCES
The findings should highlight the existence, and if possible the cause, of variances that exceed performance thresholds: more than 10% over or under budget, or more than 10% under the amount of work the investment was intended to enable (see Section 4.5, Financial Performance).
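Restating the budget threshold above as a simple check (a hedged sketch, not an official formula from OMB or ED guidance):

```python
THRESHOLD_PCT = 10.0  # reporting threshold described in this section

def exceeds_threshold(planned: float, actual: float) -> bool:
    """Return True when the cost variance is more than 10% over or under budget."""
    variance_pct = abs(actual - planned) / planned * 100
    return variance_pct > THRESHOLD_PCT

# Hypothetical example: a 9.2% overrun stays under the 10% reporting threshold.
print(exceeds_threshold(1_200_000, 1_310_000))  # False
```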
5.1.1 MISSION/PERFORMANCE/SATISFACTION GAP ANALYSIS
Address gaps identified between expected and actual results both in terms of technical
performance shortcomings and failure to meet the needs of the customer.
Explore the root causes of any gaps. Identify what, if any, additional functionality or
performance is required. If the investment is already scheduled for replacement or retirement,
name the investment(s) that will support the requirements in the future.
The reported performance variances should be based on information contained in the business
case.
The following table summarizes a few examples of topics for consideration:
Table 3: Topics for Consideration
Cause of Gap or Problem | Potential Solution
Poor performance and reliability | Modernized workstations and frequent technology refresh
Cannot meet growing demand or transaction volume | Increased capacity to meet processing, service, and mission demands
Inadequate information and computer security | Enterprise-based security authentication and/or control. Re-code sign-in to accept strong passwords
Poor customer service | Institute help desk software to accept electronic service ticket submission and tracking
Technical architecture not scalable | Re-engineer system to Web-based/cloud computing architecture
System does not address changes in legislative requirements | Modify software to address new requirements
Increasing O&M costs | Consider cloud computing, contract re-compete, or systems consolidation. Review alternatives analysis.
5.2 HOLISTIC ENTERPRISE VIEW
Step back and look at the investment. How effectively does it achieve mission performance?
Assess its relationship with other investments and their interactions. Consider business and
technology issues. The primary purpose of technology is to improve business performance in
the most cost‐efficient manner. Since government operations and program mission will change
over time, it is important to analyze how well the investment is aligned with changed
requirements. If possible, address the relationship of this investment to other investments
from the perspective of increasing effectiveness or efficiencies in accomplishing ED’s mission.
Areas that might present appropriate opportunities are standardization, consolidation, and
changing technologies. Identify opportunities for the investment to use and foster
standardized data definitions. Consult with the Segment Owner and the Enterprise Architecture Program Office to find technical and business solutions that complement, or avoid overlap with, existing or planned investments.
6 RECOMMENDATIONS
Identify solutions that can provide the needed functionality or performance. This may include
designing new processes, implementing technologies compliant with ED’s enterprise
architecture, or collaborating with other initiatives within the federal government.
Recommend whether the existing system should be:
• continued with no additional investment for DME,
• altered,
• terminated, or
• migrated to a similar system and retired.
The customer satisfaction assessment may have identified the need for capabilities related to
but beyond the scope of the current Investment. Enhancements outside of the existing project
scope are considered a new component of the investment.
If the OA identifies a need for a change in the investment’s scope and/or the cost is impacted,
the PM must prepare and submit a Baseline Change Request23.
7 PLAN OF ACTION AND MILESTONES (POAM)
Describe the actions that are scheduled to be taken in response to the findings of the OA.
Action plans should include a clear description of the desired outcome, identification of
responsible parties, a well defined course of action to be taken, and milestones (with associated
initiation and completion dates) leading to achievement of the objective.
Actions may include drafting a Baseline Change Request or initiating discussions with other
investment managers to investigate investment consolidation. Certain findings may warrant
follow‐on studies or corrective action plans.
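As a hedged illustration only (this guide does not prescribe a POAM format beyond the elements listed above), a single action-plan entry might capture those elements in a structure like the following; every name, figure, and date is invented.

```python
from datetime import date

# Hypothetical POAM entry capturing the elements described in this section.
poam_entry = {
    "finding": "FY 2011 O&M costs exceeded the approved baseline by roughly 9%",
    "desired_outcome": "Return annual O&M spending to within 5% of the approved baseline",
    "responsible_party": "Project Manager, with Segment Owner concurrence",
    "course_of_action": "Draft a Baseline Change Request and evaluate consolidation options",
    "milestones": [
        {"description": "Submit Baseline Change Request",
         "initiation": date(2011, 10, 1), "completion": date(2011, 11, 15)},
        {"description": "Report consolidation study results to the PIRWG",
         "initiation": date(2011, 11, 1), "completion": date(2012, 1, 31)},
    ],
}

for milestone in poam_entry["milestones"]:
    print(f'{milestone["description"]}: '
          f'{milestone["initiation"]} through {milestone["completion"]}')
```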
23. Department of Education Departmental Directive OCIO:3-108, Information Technology Investment Management (ITIM), 1/29/2010.
APPENDIX A: ROLES & RESPONSIBILITIES
Responsible Party | Responsibilities | Reference
OA Team | Conduct the OA. The OA team (i.e., those performing the OA) should consist of individuals that are independent from the management and outcomes of the investment. | Sec. 1 & 2
Project Manager | Develop reliable and repeatable procedures for conducting an annual OA. | Sec. 1 & 2
Project Manager | Ensure that the OA is conducted, documented, and submitted to the ITIM team. | Sec. 1.4
Project Manager | Work with the ITIM team on the OA team composition to ensure an objective analysis. | Sec. 1.4
Project Manager | Work with EAPO to determine if the investment meets the target architecture and is assigned the proper segment identifier. | Sec. 1.4
ITIM team and EAPO | Review the OA Report for completeness and ensure that it reasonably satisfies its purpose and the requirements of OMB and ED. | Sec. 1.4
ITIM team and EAPO | Evaluate the OA report using scoring criteria found in the OA Evaluation Template. | Sec. 3.3
Segment Owner | Review the OA and initiate any actions needed to improve the investment's performance. Make recommendations to the PIRWG to change scope or terminate the investment. | Sec. 1.4
Planning and Investment Review Working Group (PIRWG) | Review the performance of IT investments against stated expectations. | Sec. 1.4
IRB | Make decisions on OA findings and recommendations brought to their attention. | Sec. 1.4
APPENDIX B: OA TEMPLATES
• Attachment 1: DoED Operational Analysis Report Template
• Attachment 2: DoED Customer Survey Checklist Template
APPENDIX C: GLOSSARY OF ACRONYMS AND ABBREVIATIONS
Acronym/Abbreviation | Definition
BCR | Baseline Change Request
BPR | Business Process Reengineering
CIO | Chief Information Officer
CPIC | Capital Planning & Investment Control
DME | Development/Modernization/Enhancement
EA | Enterprise Architecture
EAPO | Enterprise Architecture Program Office
ED | Department of Education
FY | Fiscal Year
GAO | Government Accountability Office
IAMT | Investment & Acquisition Management Team
IG | Inspector General
IRB | Investment Review Board
IRM | Information Resources Management
IT | Information Technology
ITIM | Information Technology Investment Management
LoB | Line of Business
OA | Operational Analysis
OCIO | Office of the Chief Information Officer
O&M | Operations & Maintenance
OMB | Office of Management and Budget
PIR | Post-Implementation Review
PM | Project Manager
PIRWG | Planning and Investment Review Working Group
ROI | Return on Investment
APPENDIX D: GLOSSARY OF TERMS
Alternatives: The different courses of action, means, or methods by which objectives may be attained.

Alternatives Analysis: An analysis which considers the alternatives available for pursuing a business objective. Sometimes included as part of the feasibility study.

Baseline: A term used, in the context of an Operational Analysis, to describe (1) use of status quo costs and benefits as a basis for developing costs and benefits for alternatives during the cost/benefit analysis and, more importantly, (2) use of costs and benefits projected for the selected alternative during the cost/benefit analysis as a basis for comparing actual costs and benefits during cost/benefit measurement.

Benefit/cost ratio: An economic indicator of cost-effectiveness computed by dividing present value benefits by present value costs. Indicates the amount of benefits returned for each dollar invested.

Business Case: A structured proposal to make an investment, which functions as a decision package for organizational decision-makers. A business case may include an analysis of business process performance and associated needs or problems, proposed alternative solutions, assumptions, constraints, and risk-adjusted cost-benefit analysis (CBA), as appropriate to the investment. For major investments, OMB Exhibit 300 serves as the business case.

Capital Assets: Assets that are composed of land, structures, equipment, and intellectual property (including software) that are not acquired for the purpose of consumption or resale.

Cloud Computing: A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

Cost Avoidance: Benefits realized by avoiding a relatively certain future expenditure, although the projected expenditure has not been budgeted or obligated. Cost avoidance is more speculative than cost savings and requires more rigorous justification.

Cost/Benefit Analysis: Detailed evaluation of the costs and benefits of selected alternatives identified during the alternatives analysis. Includes costs of current and projected operations as a baseline for (1) determining which alternative to select for automation and (2) measuring costs and benefits of the implemented and operational system over time. Costs are normally expressed in dollars, but benefits may be expressed in dollars or in other quantitative (such as time reduction) or qualitative (such as improved security) measures. Cost/benefit analysis determines the most cost-effective solution, not simply the least cost solution. Can be included as part of the feasibility study or alternatives analysis, or stand as a separate document.

Customer: Groups or individuals who have a business relationship with the organization; those who receive or use or are directly affected by the products and services of the organization.

Development/Modernization/Enhancement (DME): The portion of an IT investment/project that deals with developing and implementing new or enhanced capability. IT investments may include DME and "steady state" (see Steady State below) components.

Earned Value Analysis: A project management tool/process that evaluates scope, schedule and cost to produce an objective, quantifiable, time-based measure of an investment's progress and performance.

Effectiveness: A project's ability to meet requirements at the project and agency level.

Efficiency: Means execution of project goals with most effect and with minimum/reasonable use of resources.

Investment: An expenditure of funds to acquire a new, or continue an existing, capability, function or asset.

Life Cycle: The time from conception to disposal of an investment, encompassing the Select, Control and Evaluate Phases.

Life Cycle Cost: The total cost of acquisition and ownership of a system over its full life, including the cost of planning, development, acquisition, operation, support, and disposal.

Line of Business: Line of Business (LoB) initiatives are by definition multi-agency efforts. Due to the multi-agency impact, multi-agency collaboration investments such as E-Gov and LoB initiatives are also by definition Major Investments (OMB Circ. A-11, Sec. 300).

Major Investment: An investment requiring special management attention because of its importance to the mission or function of the agency; or for financial management which obligates more than $500,000 annually; or has significant program or policy implications; high executive visibility; high development, operating, or maintenance costs; is funded through other than direct appropriations; or is defined as major by the agency's capital planning and investment control process. Investments not considered "major" are "non-major."

Maturity Model: Models of the stages through which organizations progress as they define, implement, evolve, and improve their processes. This model serves as a guide for selecting process improvement strategies by facilitating the determination of current process capabilities and the identification of the issues that are most critical to achieving quality and process improvement.

Mixed Life Cycle: A life cycle including both steady state and development, modernization, enhancement (DME) aspects.

Objectives: Goals, results, or program improvements that the decision-maker wants to attain. Objectives should be independent of the solution and stated in a manner that does not preclude alternative approaches.

Performance Measure: Indicators, statistics, or metrics used to gauge program performance (OMB Circ. A-11, Sec. 200.3). A method used to determine the success of an initiative by assessing the investment contribution to predetermined strategic goals. Measures are quantitative (e.g., staff-hours saved, dollars saved, reduction in errors, etc.) or qualitative (e.g., quality of life, customer satisfaction, etc.). For IT investments, a set of performance measures are reported in the "Performance Information Table" in OMB Exhibit 300.

Performance Measurement: A means of evaluating efficiency, effectiveness, and results. Performance measurement should include program accomplishments in terms of outputs (quantity of products or services provided) and outcomes (results of providing outputs in terms of effectively meeting intended agency mission objectives). Indicators, statistics or metrics used to gauge program performance. (OMB Circ. A-11, Part 7)

Performance Measurement Baseline: The time-phased budget plan against which investment performance is measured.

Post-Implementation Review: An assessment and review of a project's operational, working solution to determine whether the targeted outcome of the investment has been achieved.

Return on Investment: Project benefits in relation to costs while taking into consideration integrity, confidentiality and authenticity, availability and reliability.

Risk Management Plan: A description of potential cost, schedule, and performance risks, and an approach to managing all potential risks.

Schedule Variance: Earned value minus the planned budget for the completed work.

Segment Owner: A senior manager ultimately responsible for supporting an enterprise architecture segment and the investments contained within it.

Stakeholder: An individual or group with an interest in the success of an organization in delivering intended results and maintaining the viability of its products and services. Stakeholders influence programs, products, and services.

Steady State: The maintenance or operational stage of an investment's life cycle.

Work Breakdown Structure: A tool used to define and group a project's discrete work tasks in a way that helps organize and define the total work scope of the project.
ATTACHMENT 1
U.S. Department of Education
Investment Acquisition Management Team (IAMT)
Operational Analysis Report
Enter Investment Name
Table of Contents
1 Introduction and Executive Summary
1.1 Analysis Scope
1.2 Executive Summary
1.2.1 Customer/Stakeholder Satisfaction
1.2.2 Financial and Schedule Performance
1.2.3 Innovation
2 OA Planning
2.1 Scope of the OA
2.2 Analysis Assumptions and Objectivity
3 Documentation and Reporting
4 Methodology
4.1 Customers and Stakeholders
4.2 Customer Satisfaction Goals and Metrics
4.2.1 Customer Data Collection and Analysis Methods, Procedures, and/or Tools
4.2.2 Customer Assessment Gap Analysis
4.3 Strategic and Business Performance Gap Analysis
4.4 Financial Performance Goals and Metrics
4.4.1 Financial Performance Data Collection and Analysis Methods
4.4.2 Cost Variance Analysis Results
4.4.3 Financial Performance Results
4.4.4 Financial Performance Gap Analysis
4.5 Risk Assessment
5 Technical Metrics and Results
5.1.1 Technical Performance Goals and Metrics
5.1.2 Technical Performance Gap Analysis
6 Recommendations
7 Action Plan
Section 1 is to be completed by the Project Manager. The remaining sections are to be completed by
the person(s) responsible for conducting the Operational Analysis.
1 INTRODUCTION AND EXECUTIVE SUMMARY
Investment Name
Unique Project Identifier
Investment Status
Steady State: Mixed Lifecycle:
Operational Component Name(s)
(if Mixed Lifecycle)
Principal Office
Date of Operational Analysis
Project Manager
Date Submitted to IAMT
Revision Number
Revision Date
1.1 ANALYSIS SCOPE
Enterprise Architecture Segment
Brief description of Investment scope
Brief list of scope constraints (if any)
1.2 EXECUTIVE SUMMARY
Provide a brief summary of the findings of the OA here. Include a short description of the results of the gap
analysis (see the following sections in the OA report: 4.2.2, 4.3, 4.4.4, and 5.1.2).
USER RESPONSE HERE:
1.2.1 CUSTOMER/STAKEHOLDER SATISFACTION
Describe strategic and business results (e.g. review the performance measurement information contained in
the “Performance Information” section of the Business Case) and evaluate if the actual measurement values
(results) represent expected progress at this stage of the investment’s lifecycle. Briefly describe any
performance targets in the business case that are not currently being met or are expected to not be met in the
future.
USER RESPONSE HERE:
1.2.2 FINANCIAL AND SCHEDULE PERFORMANCE
Describe financial and schedule performance (for example, by comparing actual vs. planned expenditures).
Briefly describe any schedule variance that will cause a planned completion date of a milestone to be missed or
impact a task on the critical path. Briefly describe any cost variance greater than 5%.
USER RESPONSE HERE:
1.2.3 INNOVATION
Describe any innovation associated with this investment (for example, the potential use of cloud computing as
a technical solution, or re‐engineering of this investment resulting in consolidation with another investment).
USER RESPONSE HERE:
2 OA PLANNING
(See Section 2, “OA Planning” in the OA Guide for further reference)
2.1 SCOPE OF THE OA
Describe the investment or component that is the subject of this OA and provide a description of the business
processes that the investment supports. If an investment component is being analyzed, explain how this
particular component maps to the milestones in the current PIRWG‐approved baseline.
USER RESPONSE HERE:
2.2 ANALYSIS ASSUMPTIONS AND OBJECTIVITY
Provide a list of assumptions that were made relating to the analysis.
Explain how the objectivity and impartiality of the analysis will be maintained.
USER RESPONSE HERE:
3 DOCUMENTATION AND REPORTING
Provide a list of the data sources used to conduct this analysis.
(See Section 3, “Documentation and Reporting” in the OA Guide for further reference)
USER RESPONSE HERE:
4 METHODOLOGY
The analysis for this section focuses on five performance areas: Customer Satisfaction; Strategic and Business
Performance; Financial Performance; Risk; and Technical Performance.
(See section 4, “Methodology” in the OA Guide for further reference)
4.1 CUSTOMERS AND STAKEHOLDERS
Briefly describe the investment’s customers and stakeholders. Customers are the people and organizations who receive, use, or are directly affected by the products and services of the investment. Stakeholders may include customers and others who do not directly use or benefit from the investment but have a vested interest in its success. Briefly state how each component affects each category of customer.
USER RESPONSE HERE :
4.2 CUSTOMER SATISFACTION GOALS AND METRICS
Summarize performance and satisfaction goals, including related business case performance metrics/indicators
in the area of “Customer Results” or “Processes and Activities” (e.g., productivity, efficiency, errors, complaints,
and timeliness).
USER RESPONSE HERE :
4.2.1 CUSTOMER DATA COLLECTION AND ANALYSIS METHODS, PROCEDURES, AND/OR TOOLS
Describe the methods, procedures, and/or tools used to collect and assess customer efficiency, system usability and suitability, customer satisfaction, and other data (e.g., performance data, including error data; surveys; user group meetings; customer focus groups; system data).
USER RESPONSE HERE :
4.2.2 CUSTOMER ASSESSMENT GAP ANALYSIS
Compare planned customer satisfaction goals against actual results per the metrics cited in Section 4.2 of this report to identify the need for additional functionality and/or improved performance.
USER RESPONSE HERE :
4.3 STRATEGIC AND BUSINESS PERFORMANCE GAP ANALYSIS
Compare strategic and business goals against actual results per the metrics cited in this report to identify the
need for additional functionality and/or improved performance. Summarize the performance metrics collected,
including those for cost, schedule, and risk. Describe instances, causes and impacts where the investment
exceeded or failed to meet expectations.
USER RESPONSE HERE :
4.4 FINANCIAL PERFORMANCE GOALS AND METRICS
List the indicators/metrics used to measure performance. For example:
• Cost saving/avoidance identified in the Exhibit 300
• Return on investment
USER RESPONSE HERE :
4.4.1 FINANCIAL PERFORMANCE DATA COLLECTION AND ANALYSIS METHODS
Describe the method you are using to measure and track cost, schedule, and performance metrics. Describe the management technique you are using to monitor metrics against the baseline (monthly status review meetings, budget reviews, etc.).
USER RESPONSE HERE :
4.4.2 COST VARIANCE ANALYSIS RESULTS
Describe the quantitative measures used to assess variance from the baseline. Provide cost variance analysis details in terms of cost variance percentage (e.g., the investment has a cost variance of 8%). Summarize the results of the financial performance metrics collected. Discuss the following (an illustrative variance calculation is provided after these questions):
• Is the performance within the limits of variance?
  o If not, what is being done to bring performance back within variance limits?
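For reference only, one common way to express cost variance as a percentage of the baseline is shown below; the figures are illustrative and are not drawn from any specific investment.
Cost Variance (%) = ((Actual Cost - Planned Cost) / Planned Cost) x 100
For example, if planned expenditures to date are $10.0 million and actual expenditures are $10.8 million, the cost variance is ((10.8 - 10.0) / 10.0) x 100 = 8%, which would exceed the 5% threshold noted in Section 1.2.2 and should be explained here.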
USER RESPONSE HERE :
4.4.3 FINANCIAL PERFORMANCE RESULTS
Address the indicators/metrics used to measure the return-on-investment and payback-period estimates of the cost/benefit analyses that were part of the investment analysis.
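For reference only, commonly used formulations of these indicators are shown below; where the investment's cost/benefit analysis defines them differently, use those definitions. The figures are illustrative.
Return on Investment (%) = (Net Benefits / Investment Cost) x 100
Payback Period = Investment Cost / Annual Net Benefits
For example, an investment costing $2.0 million that yields $500,000 in annual net benefits has a simple payback period of 2.0 / 0.5 = 4 years; if lifecycle net benefits total $3.0 million, the return on investment is (3.0 / 2.0) x 100 = 150%.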
USER RESPONSE HERE :
4.4.4 FINANCIAL PERFORMANCE GAP ANALYSIS
Compare financial performance goals against actual results per the metrics cited in the appropriate sections
of this report (e.g. “Financial Performance Goals and Metrics” section) to identify the need for additional
functionality and/or improved performance.
USER RESPONSE HERE :
4.5 RISK ASSESSMENT
Describe the major challenges (if any) confronting the investment and how they are addressed.
USER RESPONSE HERE :
5 TECHNICAL METRICS AND RESULTS
Technical performance indicators/objectives should be monitored as part of an OA effort to ensure that performance continues to improve, or at least does not degrade over time. (See Section 5, “Results” in the ED OA Guide for further reference)
5.1.1 TECHNICAL PERFORMANCE GOALS AND METRICS
Describe which technical performance indicators/objectives you monitor, including appropriate items from the following list (include the units of measure; an illustrative operational availability calculation follows the list):
• Functional performance (how long it takes to perform a function using the system; e.g., process a claim)
• Frequency and length of unscheduled outages
• Maintenance and equipment outages
• Mean time between outages/failures
• Mean time to restore service
• Corrective maintenance action labor hours
• Operational availability
• Operational productivity measures (e.g., mean time to perform functions)
• Human-system error rates
• Training time to proficiency
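As an illustration of one of these metrics (the figures are hypothetical), operational availability is generally computed as the proportion of required operating time during which the system was actually available:
Operational Availability = Uptime / (Uptime + Downtime)
For example, a system that is available for 712 hours of a 720-hour reporting month (8 hours of outage) has an operational availability of 712 / 720, or approximately 98.9%.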
USER RESPONSE HERE :
5.1.2 TECHNICAL PERFORMANCE GAP ANALYSIS
Compare technical performance objectives/requirements against actual results per the metrics cited in the appropriate sections of this report (e.g., the “Technical Performance Goals and Metrics” section) to identify the need for additional functionality and/or improved performance. Summarize the analysis results of the technical performance metrics collected. Describe the instances, causes, and impacts where the investment exceeded or failed to meet expectations.
USER RESPONSE HERE :
6 RECOMMENDATIONS
Describe your recommendation to continue the investment as‐is, modify the investment
performance/management in some way, or discontinue the investment.
Justify whether the existing system should continue in operation as is, be enhanced, or be terminated. If the
system is to be enhanced or terminated, summarize the recommended actions to be taken this fiscal year or
in coming years.
(See Section 6, “Recommendations” in the ED OA Guide for further guidance)
USER RESPONSE HERE :
7 ACTION PLAN
In the table below (if applicable), list a summary of actions planned for each performance area (from Section 4 above) and the status of these actions. Actions may include plans to conduct analyses of alternate technologies or to obtain more information, in addition to corrective actions to address positive or negative cost or performance variances. (See Section 7, “Plan of Action and Milestones” in the ED OA Guide for further reference)
Performance Area | Action Planned | Planned Start Date | Actual Start Date | Planned Completion Date | Actual Completion Date | Current Status/Progress Made
Customer/Stakeholder Satisfaction | | | | | |
Strategic and Business/Mission Support | | | | | |
Financial Performance | | | | | |
Risk | | | | | |
Technical Performance | | | | | |
Attachment 2
U.S. Department of Education
Investment Acquisition Management Team (IAMT)
Customer / Stakeholder Survey Checklist
CUSTOMER / STAKEHOLDER SURVEY CHECKLIST
Table of Contents
1. Purpose: ....................................................................................................................................... 3
2. How to Use: .................................................................................................................................. 3
3. Information Requirements ........................................................................................................... 3
4. Demographic Information ............................................................................................................ 4
5. Surveying Tips and Best Practices: ................................................................................................ 4
1. PURPOSE:
This checklist provides an example of the requirements to be incorporated into customer satisfaction
surveys in order to capture information needed to perform an Operational Analysis.
2. HOW TO USE:
The list of survey questions below can be used as a reference point when developing or updating a customer satisfaction survey.
Example question:
• How satisfied are the users with the system’s capabilities and functions?
Those performing an OA must have a systematic approach for recording responses to survey questions.
See below for a sample scale that can be used for recording responses, followed by an example of how the responses might be summarized:
1                    2                    3                    4                    5
Completely Unsatisfied …………………………………………………………………. Completely Satisfied
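For example (hypothetical figures), if 40 respondents select 4 and 10 respondents select 2 on this scale, the average satisfaction score is ((40 x 4) + (10 x 2)) / 50 = 3.6. Recording responses this way allows the score to be compared from one OA cycle to the next.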
3. INFORMATION REQUIREMENTS
See below for sample questions to be included in a customer satisfaction survey.
• How satisfied are the users with the system’s capabilities and functions?
• How satisfied are the users that the system and its services are available and operational when needed?
• How satisfied are the users with the level of effort required to use and/or interface with the system?
• How satisfied are the users with the speed with which the help desk acknowledges a user inquiry or reported problem? (Applicable only to IT systems that maintain their own help desk.)
• How satisfied are the users with the effectiveness with which the help desk answers questions, resolves problems, or facilitates the resolution of a problem? (Applicable only to IT systems that maintain their own help desk.)
• How satisfied are the users with the help desk’s ability to resolve tickets on the first call?
• How satisfied are the users with the overall system, considering both its strengths and weaknesses?
4. DEMOGRAPHIC INFORMATION
The survey must capture the following demographic information.
• Frequency of use (e.g., while working, how frequently do you use the system?)
• Length of employment
• Role at ED (e.g., Data Approver, Reviewer, etc.)
5. SURVEYING TIPS AND BEST PRACTICES:
When developing the survey, start with your objectives, not your questions. What is it you want to know? Then formulate your questions around what you are seeking to learn.
• Introduce your survey, how long it will take, and how the results will be used.
• Include a “Not applicable” option where appropriate. Test your survey before administering it to evaluate the level of effort required, the difficulty of the questions, its length, and the time to complete.
• Keep questions simple, painless, and direct. Minimize the number of questions per page. Survey response rates typically increase when only a small amount of effort is required to complete the survey.
• Be objective. Pay attention to the neutrality of words and avoid absolutes like “always,” “never,” “only,” or “just.”
• Group similar questions together or in the same area of the survey. The first few questions should be easy, interesting, and aimed at grabbing the participants’ attention.
• Demographic or personal information questions are best left toward the end of the survey.
• Send reminders during the survey period to those who have not completed the survey. Notify participants of the results/outcomes.