A Normative Methodology for the US Army Corps of Engineers Operational Research & Development Project Selection


Background
USACE's Engineer Research and Development Center (ERDC) is preparing for a possible R&D budget augmentation of up to $250 million focused on operational research and development that is neither a project-based appropriation nor a broad, long-term, strategic, capabilities-focused initiative. Instead, the budget provides a short-term, focused capability and could augment project-themed R&D. The elevation of engineering R&D to a function in 2021, placing it on par with Civil Works and Military Programs, drives an emerging need for standardized, normative decision aids that encapsulate the goals of ERDC and the demographic needs of the MSCs and districts (Spellmon, 2022).
In October 2022, USACE held its first Operational R&D Workshop. This conference gathered leaders from the MSCs, USACE headquarters, and ERDC to create a plan of action to integrate strategic, tactical, and operational R&D projects. The Operational Research & Development Playbook v1.0 was established to guide project selection by outlining key technical drivers and benefits. The eight technical drivers identified in Playbook v1.0 were: innovation, uncertainties, state of practice, range of applicability, technical risks, historical performance, stakeholder impacts, and dependencies/interdependencies (United States Army Corps of Engineers, 2022). ERDC senior leaders and MSC R&D representatives identified a need for a methodology that weighs projects against one another, which is not currently outlined in Playbook v1.0 (Spellmon, 2022). The proposed solution would screen project proposals and validate project prioritization based on risks, return on investment, and innovation to establish a more uniform consensus on where to allocate funds.

Literature Review
The Engineer Research and Development Center's (ERDC) mission is to help solve our Nation's most challenging problems in civil and military engineering. ERDC is a key functional command of USACE, and it employs capabilities to aid USACE's mission of delivering vital engineering solutions, in collaboration with its partners, to secure the Nation, energize the economy, and reduce disaster risks (Pittman, 2021). R&D efforts are typically long-range, multi-decade packages that address a strategic issue, or short-term packages (within 3-12 months) that react after a problem is experienced on an already funded project. This underscores the need for operational (3-5 year) R&D initiatives that are prioritized, resourced, and integrated into cost-sharing project-partner agreements (Buchanan, 2022). In all cases, implementing and resourcing operational R&D is vital to increasing the efficiency of projects and reducing future risk.
USACE's Engineer Research and Development Center comprises five research and development areas, ten strategic campaigns, and seven core competencies (Pittman, 2021). Each of ERDC's functional areas has a varying impact on the decision-making tool and must be considered when creating the model. It is important to note that USACE MSCs make decisions based on their goals, metrics, time constraints, and the return on investment of each project proposal. With numerous needs in formulating viable metrics and models, this project aims to provide a decision support tool that creates a prioritized project portfolio given uncertainty of outcomes, including project delivery. The paragraphs below describe the process improvement, model methodology, and validation of the Scalable Weighted Operational R&D Decision (SWORD) tool in relation to Playbook v1.0, in what will become a v2.0 considered for implementation in 2023.

Systems Thinking and Functional Decomposition
Systems Thinking techniques, including systems diagramming, assisted in understanding ERDC and USACE complexities, a key first step in the 5 Step Design Thinking Process that guided the team's development of the SWORD Tool methodology. The 5 Step Design Thinking Process consists of Empathize, Define, Ideate, Prototype, and Test (Dam, 2022). This process eased the understanding and interpretation of how a decision-making process might best enhance operational R&D project prioritization as part of ERDC's portfolio. It assisted in identifying a series of process and model findings to help achieve the end states desired by ERDC and the MSCs, by considering which tool would be most usable and what vital information to include in the decision process. These steps allow the team to understand and define the current process, brainstorm and create a tool that improves it, and iterate through tests to validate the tool.
The Integration Definition for Function Modeling (IDEF0) diagram in Figure 1 is a functional decomposition that addresses the modeling and system needs of an enterprise. Specifically, the IDEF0 method models the high-level activities of a process (Hanrahan, 1995). This functional mental model unwraps the numerous contributing factors that influence key strategies and contribute to the operational R&D project selection process in ERDC. Following the 5 Step Design Thinking Process, the IDEF0 was utilized during the Define step. This process enabled value-focused design, which led to the process improvement and the SWORD prototype and testing steps. The eight ERDC values outlined in Playbook v1.0, shown to the right of Figure 1, were decomposed to establish screening criteria and value measures for the SWORD Tool.

Project Value Modeling Analysis and Initial Model Development
The goal of the project was to design a scalable, predictive, data-driven, and value-focused decision support tool for operational R&D package prioritization and selection. The research team participated in the first annual Operational R&D Workshop in November 2022 at ERDC Headquarters in Vicksburg, Mississippi. MSCs prioritized R&D projects using the provided subjective, Playbook-aided, heuristic-centered evaluation tools to evaluate each proposal. The technical drivers provided by USACE in the Playbook contributed to the process of deriving value, and thus to evaluating the given projects to produce a prioritized R&D project list (Operational, 2022).
The first step in analyzing the eight ERDC values found in the Operational R&D Playbook was to assess which of them would perform well as screening criteria and which would perform well as value measures or metrics. These values were binned into an improved set of screening criteria and value measures that provide a basis to rank order proposed operational R&D initiatives. Screening criteria covering new technologies, HQ-directed needs, technology feasibility, innovative concepts, and an operational timeframe (3-5 years) would eliminate proposed projects that do not align with ERDC's goals for R&D. Establishing hard screening criteria improves the current methodology and provides clarity for the follow-on value assessment. Meetings with the stakeholders assisted in identifying value measures and screening criteria to evaluate project packages. Data from past R&D projects was identified and used in an additive value model, the SWORD prototype, to provide initial feedback and results demonstrating the model's value to stakeholders. Excel with the SIPmath Modeler add-in was utilized to increase usability for USACE practitioners and to enable multiple features that increase decision authorities' understanding of the results.
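The pass/fail screening step described above can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation; the field names and dictionary schema are assumptions made for the example.

```python
# Hypothetical sketch of the five go/no-go screening criteria derived
# from the Playbook values; field names are illustrative assumptions.

def screen_project(p: dict) -> str:
    """Return 'go' only if the proposal passes all five criteria."""
    criteria = [
        p["uses_new_technology"],        # new technologies
        p["hq_directed"],                # HQ-directed need
        p["technically_feasible"],       # technology feasibility
        p["innovative_concept"],         # innovative concepts
        3 <= p["timeframe_years"] <= 5,  # operational (3-5 year) timeframe
    ]
    return "go" if all(criteria) else "no-go"

proposal = {
    "uses_new_technology": True,
    "hq_directed": True,
    "technically_feasible": True,
    "innovative_concept": True,
    "timeframe_years": 4,
}
print(screen_project(proposal))  # -> go
```

A project failing any single criterion (for example, a 10-year timeframe) would return "no-go" and be excluded from the value model.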

A Novel Process Methodology and Decision Model
This decision tool fits into two places in the timeline of the ERDC R&D project selection process and will be used by the MSCs 60 days prior to the Operational R&D Workshop. The tool allows the MSCs to enter an assortment of candidate R&D projects, which are first filtered by the screening criteria and then put through the value model; the tool returns a ranked list based on normative, weighted, constructed-scale value measures.
The screening criteria portion of the Scalable Weighted Operational R&D Decision (SWORD) tool is a series of five questions, using data from each project, that ensures the project stays within the parameters of operational R&D. These questions come from the ERDC values listed in the Operational R&D Playbook; as the breakdown in Figure 1 shows, the screening criteria cover state of practice, interdependencies, innovation, and stakeholder impacts, four of the eight values outlined in ERDC's Playbook. After the five questions are answered for each project entered into the tool, the screening criteria return a 'go' or 'no-go' for each project and autofill the "screened list". Each screened operational R&D project is then evaluated using an additive value model based on four parameters: Risk, ROI, Innovation, and Cost, as seen on the left of Table 1. Each parameter is placed on a constructed value scale; based on the defined value scales, the user from a USACE MSC inputs a low (a), medium (b), and high (c) score for each parameter of each project. These scores define a triangular distribution (X) in accordance with Equation 1, where U represents a stochastic value drawn from a uniform distribution, as represented in the "Random" column of Table 1. These distributions populate the graphs that show the stochastic findings of the value model. The raw user inputs are used to generate a stochastic distribution of 250 points, which is then normalized through value measures on a scale of 1-100. These outputs are multiplied by weights that the MSC user can dynamically adjust to produce the value score for each measure, and the scores for the respective metrics are summed to yield the overall project value.
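The sampling and aggregation steps above can be sketched as follows. Since Equation 1 is not reproduced in this excerpt, the sketch assumes it is the standard inverse CDF of a triangular distribution with minimum a, mode b, and maximum c; the metric names and example scores are illustrative, not data from the tool.

```python
import random

def triangular_draw(a, b, c, u):
    """Inverse-transform sample from a triangular distribution with
    minimum a, mode b, maximum c, given a uniform draw u (assumed
    form of Equation 1)."""
    f = (b - a) / (c - a)  # CDF value at the mode
    if u < f:
        return a + ((u * (c - a) * (b - a)) ** 0.5)
    return c - (((1 - u) * (c - a) * (c - b)) ** 0.5)

def project_value(scores, weights, n=250, seed=0):
    """Mean weighted additive value over n stochastic draws.
    scores: {metric: (low, med, high)} on a 1-100 constructed scale.
    weights: {metric: weight}, weights summing to 1."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):  # 250 points, as in the SWORD tool
        total = 0.0
        for metric, (a, b, c) in scores.items():
            total += weights[metric] * triangular_draw(a, b, c, rng.random())
        totals.append(total)
    return sum(totals) / n

# Illustrative low/medium/high inputs and user-adjustable weights.
scores = {"Risk": (40, 60, 80), "ROI": (30, 50, 90), "Innovation": (50, 70, 95)}
weights = {"Risk": 0.4, "ROI": 0.3, "Innovation": 0.3}
print(round(project_value(scores, weights), 1))
```

In the actual tool these draws populate the value and cost distributions per project; Cost is handled analogously and plotted against value rather than folded into the weighted sum shown here.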
Table 2 and Figure 2 present, respectively, the rank order of projects and the value scales for each parameter to the decision maker. The distributions for each project feed the stochastic value-versus-cost plots, allowing decision makers to visualize project value and cost overlaps and thus improve decision quality by representing uncertainty and dominance along a Pareto-optimal frontier.

Table 2. Prioritized Project Output and Statistics from SWORD Tool
Table 2 shows the final output of the SWORD tool, ranking each of the entered projects. The final output takes all user input data and ranks the projects by value, cost, and the value-to-cost ratio. This is the easiest output to understand because it gives a clear-cut result that allows decision makers within USACE to decide with analytical support, without further analysis.
For the MSCs, districts, or any other users of the SWORD tool to understand how the tool works and how to interact with it, a "how-to" guide was created and distributed to select clients within USACE to supplement the tool. This "how-to" document is a step-by-step outline of how to operate the tool and where to input data. It also explains what each of the graphs the tool returns means and how to read them. This is vital for the SWORD tool because it makes the tool easy to interact with and scalable to other business lines of USACE. The final value score is supplemented with eight different graphics that help further analyze project prioritization in a visual medium. These include value and cost frequency histograms, point plots of standard deviation against value and cost, value and cost cumulative distribution functions, a stochastic distribution analysis, and a sensitivity analysis. The stochastic distribution analysis, shown in Figure 2, enables a more thorough comparison of cost and value across all projects. The decision maker also has the option to focus on a two-project comparison and the percentage of dominance for each project, also shown in Figure 2. This allows the decision maker to identify what percentage of the time one alternative dominates the others. The distribution analysis visually represents the overall variation possible in a project and its value return: the tighter the shot group, the smaller the project's variation and the more likely it is to be on target. This aids in assessing the overall decision risk these projects carry.
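The two-project dominance percentage described above can be illustrated with a short sketch: across paired stochastic draws, count how often one project's value exceeds the other's. The distributions below are illustrative placeholders, not data from the tool.

```python
import random

def dominance_pct(values_a, values_b):
    """Percentage of paired draws in which project A's value exceeds B's."""
    wins = sum(1 for a, b in zip(values_a, values_b) if a > b)
    return 100.0 * wins / len(values_a)

# Illustrative 250-point value distributions for two hypothetical projects
# (random.triangular takes low, high, mode).
rng = random.Random(1)
a = [rng.triangular(50, 90, 70) for _ in range(250)]
b = [rng.triangular(40, 80, 60) for _ in range(250)]
print(f"A dominates B {dominance_pct(a, b):.1f}% of the time")
```

A figure like the comparison summary in Figure 2 reports this percentage alongside the scatter of value versus cost.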

Assessing Sensitivity and Validating the Model
The sensitivity of the model was assessed through a tornado diagram that analyzes the dynamically variable weights, on a normative scale, of each input to the value model. This sensitivity analysis is integral because it helps answer the question, "What makes a difference in this decision?" (Clemen, 2013). Figure 3 shows the tornado diagram for the SWORD tool. It reveals that Risk and Innovation have a much larger effect on the value score of each project, informing the MSC decision maker that the model's sensitivity is largely driven by the Innovation and Risk value scores entered. ROI does not have a significant effect on the sensitivity of the value scores when different values are entered, as seen in Figure 3; therefore, the sensitivity to ROI is much lower than to the other two variables affecting each project's value score.
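The one-way weight sweep behind a tornado diagram can be sketched as below: each metric's weight is swung between a low and a high bound while the remaining weights keep their base proportions, and the resulting range of total value becomes that metric's bar. The weights, bounds, and mean metric scores are illustrative assumptions, not Figure 3's actual inputs.

```python
# Minimal tornado-diagram sweep for a weighted additive value model.

def total_value(weights, metric_values):
    return sum(weights[m] * metric_values[m] for m in weights)

def weight_swing(base_weights, metric_values, metric, lo=0.1, hi=0.6):
    """Value range produced by varying one weight while the remaining
    weights keep their base proportions and the total stays 1.0."""
    others = [m for m in base_weights if m != metric]
    other_sum = sum(base_weights[m] for m in others)
    outcomes = []
    for w in (lo, hi):
        weights = {metric: w}
        for m in others:  # renormalize remaining weights to 1 - w
            weights[m] = base_weights[m] * (1 - w) / other_sum
        outcomes.append(total_value(weights, metric_values))
    return min(outcomes), max(outcomes)

base = {"Risk": 0.4, "ROI": 0.3, "Innovation": 0.3}
values = {"Risk": 90, "ROI": 68, "Innovation": 40}  # illustrative mean scores
swings = {m: weight_swing(base, values, m) for m in base}
# Print bars widest-first to match the tornado shape.
for m, (lo, hi) in sorted(swings.items(), key=lambda kv: kv[1][0] - kv[1][1]):
    print(f"{m:<11} {lo:6.1f} .. {hi:6.1f}")
```

With these illustrative scores, the Risk and Innovation bars come out far wider than the ROI bar, mirroring the qualitative finding from Figure 3.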
To validate the SWORD tool, two MSCs will enter five projects each and work through the tool. The MSCs will then fill out a survey containing the System Usability Scale (SUS) to provide feedback on the usability and effectiveness of the tool. This is one way to validate the tool, through users assessing its usability and effectiveness; in future iterations, the model will be validated by comparing model results to actual data points in a validation data set. The tool was briefed to ERDC and USACE Headquarters, where the screening criteria and the value model were approved. This served as a validation of the tool, showing that it will be useful to the client at hand. The team at ERDC approved of the model and explicitly stated that the tool would have a drastic positive effect on the project selection process in a v2.0 of the Operational R&D Playbook.
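For reference, the standard SUS scoring rule that would be applied to the survey responses is shown below (ten 1-5 Likert items; odd-numbered items score as response minus 1, even-numbered items as 5 minus response, summed and multiplied by 2.5). The example responses are hypothetical.

```python
# Standard System Usability Scale (SUS) scoring on a 0-100 scale.

def sus_score(responses):
    """responses: ten integers in 1..5, in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # index 0 is item 1 (odd)
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

Averaging these scores across MSC respondents would give a single usability figure for the tool; scores above roughly 68 are conventionally read as above-average usability.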

Findings
The feedback on the analysis, which led to a process improvement initiative with the novel SWORD model, illustrated the need for a methodology enhancement in ranking R&D initiatives so that resources can be allocated appropriately and most efficiently. MSCs required a normative approach to quantifying their prioritization of R&D projects in support of decisions to allocate R&D resources across very differently scoped operational projects. The SWORD Tool was well received, but it still requires additional validation at the MSC level and against validation data sets. As MSC users continue to verify the SWORD Tool through its use, it can be more narrowly tailored to fit the timeline of USACE projects.

Figure 2. Stochastic Project Value vs. Cost Scatter Plot and Comparison Stats Summary

Figure 3. Tornado Diagram for Analyzing Sensitivity

Table 1. Constructive Value Explanations and User Input Section of Value Model