DAU Story

By: CAPT Tim Hill, USN (Ret.) and Diana Teel

There has been a good deal of both talk and action about speeding the upgrades of U.S. military capabilities, especially given the rapid improvements made by our potential adversaries. Most of the discussion focuses on reducing acquisition cycle times, harnessing the large private industry expenditures on innovation, and seeking new ideas and products from nontraditional sources.

It is fortunate that this effort has not been “all talk and no action.” Congress has expanded authorities and relaxed several restrictions in order to encourage quicker action by the Defense Acquisition Workforce. Additional funding has been appropriated for innovation efforts such as prototyping and rapid fielding. Touch points such as the Navy’s Tech Bridge network have been put in place to foster an ecosystem of innovation within the Department of Defense (DoD) and to make it easier for nontraditional providers to connect.

Even with all of that, there has been little measurable increase in capability delivery. Rapid fielding and true game-changing capability deliveries remain one-off celebrations instead of the norm. These successes are certainly worth celebrating, but we need more. Unfortunately, DoD’s evaluation process remains flawed when it comes to bringing nontraditional participants to the DoD market.

Typically, the rallying cry to attract these players focuses on “solving DoD’s ‘hard problems,’” ranging from technological leaps such as hypersonic and directed-energy weapons and quantum computing and sensing, to much more pedestrian challenges such as improved training to prevent maintenance and critical operational errors. At the end of the day, we struggle to connect a “hard problem” with a viable, even partial, solution by applying funds through some sort of contractual vehicle.

The problem is multifaceted, and it entails the following:

  1. Know/understand/curate the “hard problem.”
  2. Assess technological maturity of the proposed solution.
  3. Exercise sufficient budget flexibility to fund the proposed solution, even if “out of cycle.”
  4. Apply an appropriate acquisition vehicle.
  5. Have a government/industry (and potentially academic) team that can effectively execute solution delivery.

Know and Curate the “Hard Problems”

Today, if an innovator asks a DoD employee, “What are the DoD’s hard problems?” he or she would get a variety of responses depending on the employee’s role, experience level, military Service affiliation, and myriad other factors. Different DoD organizations have their own versions of “hard problem” lists. It might be the “top 10 degraders” for aircraft availability within a carrier air wing, a list of Small Business Innovation Research (SBIR) or Broad Agency Announcement (BAA) topics from a research organization, or the totality of white paper requests within a particular Other Transaction Authority (OTA) consortium. But there is no consolidated and curated list of “hard problems” that fully spans any of the Services, let alone the entire DoD.

Moreover, these existing lists do not provide a view of the “problem” that integrates the perspective of both the acquisition teams and the operators. In many cases, what is covered is more of a symptom than the actual root problem. These lists are very tribal, and any sharing of potential solutions is typically a matter of accidental collisions rather than intentional activities meant to draw communities of interest together. Finally, pairing of potential solutions with actual problems is somewhat haphazard.

Even productive efforts by entities such as SOFWERX or the Defense Innovation Unit (DIU, formerly DIUx) depend on “catching a solution with their own net” by drawing potential problem solvers into their organic ecosystems. The Navy may have done slightly better by networking its innovation hubs, known as Tech Bridges, so that theoretically the problems and solutions in any single Tech Bridge ecosystem are shared across the entire network. But while this approach contemplates and encourages collaboration, there is no knowledge management system to actively support it. Instead, the approach relies on individual heroics to “connect the dots.”

Some nascent efforts are under way to address this challenge. One ongoing effort takes place through the National Security Innovation Network (NSIN), where college students explore this knowledge management problem and contemplate potential solutions. Ideally, the knowledge management solution would catalog technological capabilities and associated advancements/solutions separately (and somewhat agnostically) from warfighting problems. This would allow mix-and-match application of technology solutions to operational issues in any pertinent domain, similar to weapon-target pairing during operations. But such solutions are overdue and sorely needed if DoD is to realize any real acceleration in its bid to speed capability delivery.

Assess Technical Maturity

As we attempt to add new technology to our forces, we focus heavily on the technology’s maturity, but we do not adequately assess the ability of the provider to deliver that technology. This is true whether we are dealing with an enhancement to an existing weapon system or an entirely new platform. Prior to legally required milestone certifications, technologies must undergo a formal evaluation. This typically takes the form of a Technology Readiness Assessment (TRA) that results in a Technology Readiness Level (TRL) assigned to the technology. Using data produced by prototypes, lab testing, and other methods, the TRA evaluates the technology’s maturity for use in its intended operational environment.

TRL is expressed as a number, on a scale of 1 to 9, where 9 represents the most mature technology. This metric allows for relatively consistent discussions about emerging technologies. According to the Office of the Secretary of Defense (OSD) TRA Guidance issued in 2011, these are some important TRL milestones:

  • TRL 4, where component/breadboard validation occurs in a laboratory environment, demonstrating basic functionality of the integrated system
  • TRL 6, where a prototype is demonstrated in a relevant environment (typically the required level for proceeding beyond Milestone B certification)
  • TRL 7, where the system prototype is demonstrated in an operational environment
  • TRL 8, where the actual system is completed and qualified through test and demonstration
  • TRL 9, where the actual system has been proven through successful mission operation
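The milestone structure above can be sketched as a simple lookup table with a threshold check. This is an illustrative sketch only: the level descriptions paraphrase the article’s summary of the 2011 OSD TRA Guidance, and the function name and Milestone B threshold constant are invented for the example, not part of any official tool.

```python
# Illustrative sketch: TRL milestones as a lookup table, with a check
# against the typical Milestone B threshold (TRL 6) cited above.
# Descriptions paraphrase the article's summary; names are assumptions.
TRL_MILESTONES = {
    4: "Component/breadboard validation in a laboratory environment",
    6: "Prototype demonstrated in a relevant environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operation",
}

MILESTONE_B_MIN_TRL = 6  # typical level required to proceed beyond Milestone B


def ready_for_milestone_b(trl: int) -> bool:
    """Return True if an assessed TRL meets the typical Milestone B threshold."""
    if not 1 <= trl <= 9:
        raise ValueError("TRL must be an integer between 1 and 9")
    return trl >= MILESTONE_B_MIN_TRL


print(ready_for_milestone_b(6))  # True
print(ready_for_milestone_b(4))  # False
```

A real assessment, of course, rests on the evidence behind the number (the TRA data discussed next), not the number alone.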

The TRL is based on a TRA—the compilation of data collected over the life of the technology being evaluated. An example TRA checklist from NASA appears in Figure 1. In some cases, this could cover years of data collected in a wide variety of venues, configurations, and environments. Understanding TRL in this relevant context assists program teams and senior decision makers in characterizing progress and risk to fielding the needed solution within the specified time and budget. It is a critical part of understanding the health of the effort. DoD is generally quite good at this when the assessment is conducted objectively and in proper context (evaluating the technology in the correct operational environment).

Unfortunately, it is mostly assumed that the technology provider will be able to deliver that technology—at the scale needed for our operations with the applicable certifications to integrate into the existing Fleet safely, efficiently, and securely with a realistic sustainment plan. That issue is discussed in more detail later in this article.

Figure 1. NASA Technology Readiness Level Assessment Matrix

Source: NASA

Apply Appropriate Acquisition Vehicle

Acquisition vehicles connect funding with work that delivers capability. To successfully deliver a required capability, the appropriate acquisition vehicle must be employed. These vehicles can range from a very complex contract to a simple agreement among government organizations—as when a Warfare Center delivers an in-house product for a program to include in the overall weapon system. There are many options along this spectrum of vehicles, and applying the correct tool can mean the difference between success and failure, particularly where speed is concerned. Over the last several years, Congress has greatly increased the flexibility available to DoD in applying these vehicles, and DoD has been increasingly willing to use this flexibility. The expanded use of OTA instruments and SBIR vehicles across all life-cycle phases is a testament to this good-news story for DoD.

However, cultural barriers and knowledge gaps remain, preventing DoD acquisition from achieving its maximum possible speed. Some organizations suffer from a “that’s not the way we’ve always done it” mindset, while others have not embraced the potential rewards of managing the increased risk that some of these tools entail. As a result, the lag is due more to the manner of execution than to the availability of tools.

Many senior leaders have been quick to point this out, stating that DoD now has all the authorities required to move quickly and simply needs to do so. During congressional testimony in 2018, then-Assistant Secretary of the Air Force for Acquisition William Roper said: “The committee should declare victory on the reforms. I think it is what we need as Service acquisition executives to go try to restore the appropriate level of decision authority where it belongs.” With the authorities mostly corrected, what remains is to use them fully. The required cultural changes are not trivial, but they are very different from structural problems related to authorities, as are many of the budget flexibility issues.

Exercise Sufficient Budget Flexibility

Anyone who has served in the Defense Acquisition Workforce or participated in budget construction will say that there is never enough money. While tough prioritization decisions are always required, and current budget levels failed to anticipate recent inflation, DoD’s budget can still accommodate most true needs. But as was widely reported in the 2022 cycle, the budget process still lacks sufficient agility and flexibility.

First of all, budgeting takes too long and is too bureaucratic. The current process takes at least two years to program funds for a newly realized requirement. Some funds are set aside for contingency requirements or for discretionary departmental use, but these dollars are relatively scarce. The process for reprogramming funds is onerous and fraught with peril at each level beyond the originally programmed account.

If a program manager (PM) can identify funds within his or her own program element, reprioritizing is relatively easy. However, emergent requirements outside a formal program, or those that require reallocation of funds across programs, are dangerous. It’s easy enough to identify a shortfall and its priority ranking. But once an offset is identified to fill that shortfall, that funding becomes a target for anyone in need. As a result, PMs and budgeteers are “gun shy” about exercising the authorities granted by Congress to fund DoD’s emergent needs.

Compounding the problem, the “execution” mindset within the federal budget process fails to encourage efficiency and savings. Instead, funding owners are judged by their ability to spend their budgets. These metrics, known as benchmarks, were implemented to identify available funds (or those likely to become available as the fiscal year progresses). But the net effect is a stovepiped financial system that encourages hoarding cash over becoming more efficient and being rewarded for returning funds to meet emergent needs.

The fundamental processes and authorities need not change, but the manner in which they are executed certainly must. DoD personnel at all levels must be rewarded for efficiency gains that free already budgeted dollars for reallocation to emergent priorities. Perhaps this behavior could be incentivized by allowing a PM who identifies such savings to retain a portion for discretionary use on internal unfunded requirements while returning the remainder for Service-level priorities.

Next, DoD leaders, particularly those in financial management (with the help of Congress), need to quell the current “feeding frenzy” surrounding legitimate reprogramming efforts to address emergent requirements. Reprogramming is a necessary and proper action that should not evoke fear. Finally, benchmark metrics should be revisited, not necessarily in terms of where the metrics are set, but in how they are viewed. DoD leadership could enable honest conversations at all levels about making available dollars that become at risk due to program delays, realized efficiencies, or numerous other reasons. And making those funds available for other programs or purposes should incur no cuts or retribution to the donor in a subsequent year.

Figure 2. Sample Top-Level Acquisition Readiness Level Checklist

Source: The authors

Ability to Execute Solution Delivery

As mentioned previously, DoD is quite adept at evaluating the maturity of specific technologies for an intended mission. However, DoD doesn’t always evaluate a provider’s ability to deliver that technology at the needed scale while using the planned acquisition strategy. Furthermore, the government team’s ability to execute and oversee that acquisition strategy is not generally evaluated.

There are many examples where these oversights have delayed fielding, reduced performance, or created unsatisfactory availability of the technology. The missing assessment, an Acquisition Readiness Level (ARL), is especially critical as DoD courts nontraditional providers that may lack the infrastructure and DoD acquisition experience needed to succeed in the required task. We propose a formal assessment by the government product team or some sort of senior evaluation group that quantifies the provider’s readiness to produce the technology as intended.

A top-level example of what an ARL checklist might look like appears in Figure 2. Maturity would be evaluated for each line item indicated (and potentially more, based on the full development of this concept). This maturity would be rated in both an unmitigated condition accounting for only the organic capability and/or experience of the provider and a mitigated condition that would consider any assistance or mentorship provided by the government team or an industry source. Just as in many existing programs, it is certainly plausible that the program team may choose to move forward with low maturity in one or more categories not mitigated to “green.” Compared with unconscious assumptions that now may occur, this would be a very intentional choice by the combined government-industry team, likely identified as a program risk and then tracked and mitigated as such.

It is also important to consider that the ARL checklist would not be a single checklist for all potential combinations of provider, technology, and acquisition strategy; the ARL assessment must be tailored to the situation. This closely parallels defining the “relevant” and “operational” environments for assessing TRL.
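The proposed two-condition rating scheme can be sketched in a few lines. Everything here is a hypothetical illustration of the concept: the line-item names, the red/yellow/green scale, and the helper function are invented for the example and are not an official DoD checklist or policy.

```python
# Hypothetical sketch of the proposed ARL checklist: each line item is rated
# in an unmitigated condition (the provider's organic capability alone) and a
# mitigated condition (after government or mentor assistance). Item names and
# the red/yellow/green scale are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class ArlItem:
    name: str
    unmitigated: str  # rating on provider's organic capability/experience only
    mitigated: str    # rating after planned assistance or mentorship


def open_risks(items):
    """Return items still below 'green' after mitigation; the article suggests
    these would be carried as formal program risks, tracked and mitigated."""
    return [item.name for item in items if item.mitigated != "green"]


checklist = [
    ArlItem("FAR-based contracting experience", "red", "green"),
    ArlItem("Certified accounting system", "red", "yellow"),
    ArlItem("Production at required scale", "yellow", "green"),
]

print(open_risks(checklist))  # ['Certified accounting system']
```

The point of the sketch is the intentionality: an item left below “green” is a deliberate, visible risk decision by the combined government-industry team rather than an unconscious assumption.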

For example, adding the latest sensor technology to a strike fighter via spiral development of an existing sensor, executed by the aircraft’s Original Equipment Manufacturer (OEM) using traditional contracting methods as part of a planned block upgrade, may result in a very high ARL. In fact, many attributes of that plan (OEM involvement and coupling with a planned block upgrade are prime examples) minimize the acquisition risk. However, with some newer strategies relying on commercial or start-up companies to deliver critical capability, potentially as a standalone, out-of-cycle upgrade, there may be more risk in achieving the desired ARL.

If the provider has never contracted under a Federal Acquisition Regulation (FAR)-based contract and has no certified accounting system (both required by the planned acquisition strategy), the program may carry far more risk than an assessment of technical maturation alone would suggest. Intentionally evaluating these maturity aspects will allow the technology provider and the government product team to work together on mitigating the ARL risk.

That mitigation could take many forms. One option would be for the government team to assist much more actively than in the past, though that path certainly poses additional risks from a contractual standpoint. Or a mentor could be found to help the technology provider fill gaps in the ARL assessment. For instance, if a provider lacking FAR-based contracting and certified accounting experience were teamed with a mentor from the defense industry, the provider likely could get the help needed to ensure that the required deliveries are made. This is not so different from the formal and informal mentor-protégé arrangements that exist today. The main differences are the following:

  1. The government (or its innovation center partner) would likely be involved in informing the technology provider that it needs a mentor to reduce delivery risk to an acceptable level.
  2. The mentor-protégé arrangement would likely be much more specific and limited (at least initially) to the task at hand.
  3. Intellectual property (IP) protection for the protégé company would be a central part of the arrangement.

As the technology provider is evaluated, the government team should also be evaluated for its ability to execute and oversee the required acquisition strategy. It is well understood that successful fielding of any capability requires an integrated government-industry team, and this assessment would be part of that process, similar to the pre-deployment assessments done for any operational unit. Those units’ capabilities, proficiency, and readiness factors are evaluated against an objective standard, with deficiencies either remediated or accepted as risk to mission. Similarly, the government team could be evaluated and gaps mitigated through “top-off” or refresher training or by targeted personnel replacements. This would allow achievement of a required ARL for the fully integrated government-industry team.

Implementing some version of ARL and the recommended mentor-protégé arrangement could significantly increase the success rate of innovation.

The Path to Success

It is time to recognize that many things have been done well in recent years to speed innovation and capability delivery. Through many of these actions, acquisition leaders have removed “handcuffs” from DoD’s acquisition workforce and the innovators that they partner with.

However, several actions are needed if DoD is to truly increase the rate of capability delivery, especially when considering the desire to harness commercial innovation and nontraditional providers. The most critical gaps today are funding agility, curating the “hard problems,” and supporting innovators all the way to delivery and fielding within the DoD acquisition structure. DoD should address those gaps as quickly as possible if we are to remain ahead of our potential adversaries.

Hill is a recently retired Naval Flight Officer and acquisition professional. Following his role in operational squadron command, he served in a series of acquisition leadership positions at the F-35 Joint Program Office, the Long Range Anti-Ship Missile Deployment Office, and Naval Air Warfare Center Training Systems Division (NAWCTSD–NAVAIR) in Orlando, Fla. These positions highlighted the need for speed and innovation to maintain U.S. competitive advantage.

Teel is a career Navy civilian employee, most recently serving as the Director of Outreach and the Central Florida Tech Bridge/Tech Grove at the NAWCTSD. Her roots as a Logistics Specialist and her experience standing up one of the Navy’s initial Tech Bridges (and the first to model Joint Service collaboration) have ignited a passion for agility and critical thinking.

The authors can be contacted at hill@irtc-hq.com and diana.c.teel.civ@us.navy.mil.
