News | Feb. 7, 2020

A Blue-Collar Approach to Operational Analysis: A Special Operations Case Study

By Steven J. Hendrickson and Riley Post | Joint Force Quarterly 96


Steven J. Hendrickson is a Lead Associate with Booz Allen Hamilton, Inc., and Task Lead for the U.S. Special Operations Command J52 Strategic Analysis Cell. Lieutenant Colonel Riley Post, USA, is the Program Manager for Army Talent Based Branching at West Point.

SEALs participate in ground mobility training with Mine Resistant Ambush Protected armored vehicles, Forward Training Area, March 28, 2012 (U.S. Navy/Meranda Keller)

For many military commanders, the word assessment induces bouts of eye-rolling, daytime drowsiness, and, in some cases, mild nausea. This condition typically results from years of exposure to well-intentioned analysts briefing either overly complicated analysis that is unintelligible to all but the presenter or, on the other end of the spectrum, overly simplified stoplight charts and thermographs aggregated into trivial and often deceptive “trends.” As analysts responsible for organizing the commander’s assessments at Special Operations Command Central (SOCCENT), we have, at times, been those briefers, struggling to provide value to the command. However, through trial and error over 4 years and with three different commanders, we homed in on an analytic process that both informed decisions and catalyzed organizational change at SOCCENT. Our goal in this article is to distill those years of experience into a set of simple principles that are useful to any commander and applicable across a wide variety of requirements.

This article stands on two assumptions about value-added operational analysis: the analysis has to be right, and commanders must use it.1 Since commanders, not operations research and systems analysts, make assessments, analysis results are only valuable if they are trusted and allow the commander to produce faster or more informed decisions. Commanders are, above everything else, decisionmakers, and good analysis will lead to better or quicker decisions. We hope this article helps both commanders and their staffs avoid some of our mistakes along the path to more accurate and useful analysis, allowing for improved decisionmaking across an organization.

Context

Our earliest attempts at operational analysis in SOCCENT were neither overtly right nor useful to the commander. Relying on a traditional doctrine-guided approach, we attempted to answer “how are we doing?” by translating task accomplishment to objective achievement through measures of effectiveness (MOEs) and measures of performance (MOPs). This method focused extensively on activity, such as number of engagements and partner-nation units trained, rather than understanding effects the unit created in the operational environment. Then we attempted to translate that activity to an estimate of progress toward achieving objectives.

As we forced doctrine to fit our problem, we began to recognize five underlying behaviors that undermined the value of our work:

  • We did not seek to answer—or even understand—the most pressing question for the commander. Although our process reflected the campaign plan goals, we wasted energy answering questions of little consequence to the commander.
  • We passively used the data we had rather than actively collecting the data we needed. When we should have been asking, “What data do I need to learn?” we were asking, “What can I learn from the data?”
  • We isolated ourselves from the rest of the staff and made no effort to build relationships with forward elements. This relegated us to being graders of the commander’s homework rather than an integrated evaluation and feedback mechanism for plans.
  • We compounded these mistakes by quantifying and aggregating everything through a complicated system of questionable mathematical models.
  • We did not ask, “At what cost?” and so we could not help the commander understand the amount of resources applied to create the outcomes we observed.

Not surprisingly, for the first few years neither the SOCCENT commanding general (CG) nor any other senior leader within the organization found our work to be particularly useful. As keepers of the data, we would get the occasional request for information, but rarely did anyone use our analysis to make meaningful decisions. Moreover, despite our best efforts to convince people otherwise, none of our forecasts (read: guesses) of future outcomes gained traction. In short, we were a marginalized team, spending our days nurturing a complicated model that nobody seemed to care about. Something needed to change.

A Better Way

In late 2015, we scrapped our existing methods and charted a new path. We stopped adhering to common practices, including the strict mechanical process rooted in MOEs and MOPs. Instead, we developed what we view as a “blue-collar business case” analysis focused on measuring and articulating SOCCENT return on investment (RoI)2 for resources applied across the area of responsibility (AOR).3 In doing so, this process:

  • described SOCCENT’s allocation of resources across the AOR
  • articulated current progress toward objectives according to the commander’s stated priorities
  • identified gaps relative to desired outcomes in the AOR
  • recommended measures to address those gaps with future investments or divestments across the AOR.

Despite its flaws and room for improvement, the new process was deemed effective by the SOCCENT commander because it produced digestible and analytically sound outcomes that commanders and staffs across the enterprise used for making resource allocation decisions, communicating outside the organization, and building future plans. These outcomes manifested at multiple levels of the enterprise, from civil affairs teams adjusting their areas of focus to the CG redirecting Marine Special Operations Teams (MSOTs) across the battlefield.

In the course of building our new way ahead, we identified seven keys to success or guiding principles:

  • Answer the question of interest to the command.
  • Tie all analysis to clearly defined and agreed-upon requirements.
  • Be proactive about data collection.
  • Be value-added at multiple levels.
  • Build collaborative networks to execute, verify, and validate analysis.
  • Resist the tyranny of averages and aggregation wherever possible.
  • Understand that products matter, but not as much as the process.

The remainder of this article focuses on presenting these seven guiding principles and illustrating how to replicate our process in almost any command.

1: Answer the Question of Interest to the Command

At SOCCENT, we found answering a single question, the one most prominent in the CG’s mind, provided a coherent logic for motivating both staff and subordinate commanders to actively participate in the analytic process. In essence, if the boss cares about a topic and is constantly asking about it, the individuals in the unit want to be part of the answer. Fortunately, we enjoyed an environment of shared information and openness to inquiry. This allowed our team to attend strategy sessions with the CG, his team of directors, and subordinate commanders. It also gave us the space to iterate with the CG to identify what analysis he found useful. Through this combination of passive and active elicitation, we identified the following question of interest to the CG: “Given a finite number of special operations forces [SOF] and a nearly infinite demand for their capabilities, how does SOCCENT allocate its SOF to maximize achievement of planned objectives in the U.S. Central Command [USCENTCOM] AOR?”

The scarcity of SOF relative to demand prevents SOCCENT from applying high-end human capital to every problem set in the AOR. For every application of SOF against one problem, there is an inherent opportunity cost of not investing somewhere else. The question, then, is not simply, “How is SOCCENT doing relative to its stated objectives?” but rather it is a more expansive inquiry that considers the opportunity cost of accomplishing those objectives. In simplest terms, this is an RoI question, the answer to which requires a clear understanding of resources available, CG priorities, the expected returns to any given investment of SOF, and an evaluation of what actually materializes in the operational environment. In theory, there existed some optimal allocation of SOF that maximized SOCCENT’s effect in the AOR. We built our analysis to move the command toward that allocation.
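To make the opportunity-cost framing more concrete, one way to sketch the comparison is to line up the share of SOF investment applied to each objective against the commander's stated priorities. The short example below is our own illustrative construction rather than the command's actual model: the objective names, priority weights, and investment figures are invented, and in practice SOCCENT articulated this comparison through qualitative narrative rather than a single score.

```python
# Illustrative sketch of the opportunity-cost framing: compare where SOF
# investment actually went against the commander's stated priorities.
# Objectives, weights, and figures are invented for illustration only.

priority_weight = {        # higher weight = higher CG priority (sums to 1.0)
    "Objective A": 0.5,
    "Objective B": 0.3,
    "Objective C": 0.2,
}
investment = {             # SOF investment applied over the period
    "Objective A": 4_000,  # arbitrary units; principle 2 below uses man-days
    "Objective B": 1_000,
    "Objective C": 5_000,
}

total = sum(investment.values())
for objective, weight in priority_weight.items():
    share = investment[objective] / total
    delta = share - weight
    print(f"{objective}: priority {weight:.0%}, investment share {share:.0%}, "
          f"delta {delta:+.0%}")

# A large positive delta against a low-priority objective flags a candidate
# for divestment; a large negative delta against a high-priority objective
# flags a candidate for further investment.
```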

Servicemembers assigned to Naval Special Warfare Group 2 conduct military dive operations off East Coast of United States, Atlantic Ocean, May 29, 2019 (U.S. Navy/Jayme Pastoric)

2: Tie All Analysis to Clearly Defined and Agreed-Upon Requirements

Every organization faces requirements. In the financial world, the requirement is clear: apply human and physical capital to generate a profit, and measuring returns is a simple accounting drill. In organizations not driven by profit, such as the military or other public-sector entities, measuring and articulating RoI is more challenging. For example, no commonly agreed-upon method exists for measuring and comparing the investments and returns of training a partner SOF unit, conducting a key leader engagement with a partner special forces commander, or exploiting the information environment to degrade support for violent extremist organizations.

To standardize RoI measures, we defined returns and currency in an operational context. Returns were either desired or actual:

  • Desired returns: Objectives in regional plans, or the state of the operational environment that SOCCENT expected to materialize by applying SOF resources to them.
  • Actual returns: The observable impact SOF resources—through the execution of operations, actions, and activities (OAAs)—had on objectives.

Using these definitions, we were able to standardize and defensibly articulate comparisons of outcomes to the commander’s expected outcome.4

Next, we defined a standardized measure to make comparisons of investments across units. We settled on man-days of SOF as the unit of measure for resources applied to an OAA. For example, a 12-man Special Forces Operational Detachment–Alpha (ODA) conducting a 10-day training engagement in Lebanon would count as a 120-man-day (12 men x 10 days) investment applied in Lebanon.5 Although this approach did not capture every SOF investment in the region, it did encompass the majority of activities and, more important, focused on the operational units that could be shifted from one mission set to another. Furthermore, it was a way of measuring SOF investment across all types of OAAs, campaigns, and phases of war.
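Because the man-day measure is simple arithmetic, the bookkeeping is easy to script. The sketch below is a minimal, hypothetical illustration of how OAA-level investments expressed in man-days might be tallied and rolled up by country; the unit designations, durations, and figures are notional and are not SOCCENT data.

```python
# Minimal sketch of the man-day investment measure described above.
# Unit designations, durations, and figures are notional, not SOCCENT data.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class OAA:
    """One operation, action, or activity executed by a SOF element."""
    country: str
    unit: str
    personnel: int  # operators applied to the OAA
    days: int       # duration of the OAA

    @property
    def man_days(self) -> int:
        # e.g., a 12-man ODA on a 10-day engagement = 120 man-days
        return self.personnel * self.days

oaas = [
    OAA("Lebanon", "ODA (notional)", personnel=12, days=10),          # 120
    OAA("Jordan", "MSOT (notional)", personnel=14, days=21),          # 294
    OAA("Lebanon", "SEAL platoon (notional)", personnel=16, days=7),  # 112
]

# Roll investments up by country to describe how SOF were allocated
# across the AOR during the period under review.
investment_by_country = defaultdict(int)
for oaa in oaas:
    investment_by_country[oaa.country] += oaa.man_days

for country, man_days in sorted(investment_by_country.items()):
    print(f"{country}: {man_days} man-days of SOF invested")
```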

Critically, the process of measuring investments and returns relative to desired outcomes pinned the analysis to clear requirements, lending validity and, ultimately, utility to decisionmakers. The direct reliance on the SOCCENT plans to drive analytic requirements also allowed us to provide constructive feedback to the planners at the end of the analysis cycle.

3: Be Proactive about Data Collection

Once we determined the critical question and defined requirements, we identified the data we needed and the person or unit most likely to have that data. In-person, proactive data collection fundamentally changed our process. We argue that it is the differentiator that enabled useful analysis and elevated our team to an integral element of the command.

In April or May of any given year, staffers around the Department of Defense receive the dreaded annual analysis data call tasker from “higher.” Almost without fail, it comes in the form of a lengthy, confusing email with an equally confusing Excel spreadsheet attached, or an equally unhelpful Task Management Tool message. In turn, these staffers push similar requests throughout their organizations and subordinate units until some poor captain or major is stuck with the task. Not surprisingly, the returning data vary greatly in quality and are wholly dependent on the knowledge, competence, and motivation of the respondent. The result is a mixed bag of high-quality, detailed data and check-the-box drivel—the combination of which precludes useful analysis.

To overcome this plight, we physically went to the source of the data. In practice, this often required traveling across the AOR to conduct in-person interviews with forward commanders and operational units executing SOCCENT orders. In other cases, the only travel required was foot movement to another staff section, such as the J2 or J4. In all instances, though, we built data collection platforms tailored to the type and source of data we needed and followed up in person.

Regardless of data type, we supplemented all primary-source data by mining open-source and classified reporting before and after in-person visits. Doing so allowed us to capture information already available in situation reports, intelligence information reports, and other reporting from the operator, letting us focus personal interactions on data gaps rather than burdening the operator with questions already answered in reporting. Lastly, because our interview sample size was small, we found the supplementary data useful for broader perspective and clarification.

This multisource data collection approach reaped several benefits, including increased detail and veracity of data, insight for subordinate commanders, the development of a collaborative analytic network, and increased buy-in to the process across the command. The active approach to data collection also gave the analytic team unique insights into a wide spectrum of issues across the command, affording it the opportunity to contribute to teams and projects outside of its normal analytic requirements.

4: Be Value-Added at Multiple Levels

In SOCCENT, forward operational units held the keys to the best data available. However, these operational elements are mostly ODAs, MSOTs, and SEAL platoons—tightly knit groups wary of “outsiders.” These teams typically operate at a tempo and in an environment that is not conducive to site visits from data collectors.

To solve this access problem, we flipped the traditional analysis approach on its head, focusing on providing an analytic service to the forward node rather than simply seeking data for the higher headquarters analysis. In part, this approach originated from an unanticipated stroke of good fortune. In October 2015, a U.S. SOF commander in Lebanon asked our team to review the progress of his command and provide recommendations for resource allocation as well as future campaign activities. This commander also happened to be one of the more vocal leaders in the SOCCENT enterprise. When our analysis and products exceeded his expectations, he became our best advocate, using our products to articulate his progress and intent to his peers, the CG, and leaders of outside organizations, including the Ambassador and Embassy staff. His advocacy opened doors throughout the command, allowing our team to visit SOCCENT subordinate units in every corner of the AOR.

With our foot in the door, we established relationships with other forward commanders—the primary consumers and advocates of our product—to apply concentrated analytic capability to their most pressing concerns. When meeting with the commander, “How can we help you answer your mail?” was always one of the first questions we posed. Most of our subordinate commands lacked the staff manning to dig deeply into anything other than immediate mission requirements. As an analytic team, we viewed ourselves as a temporary staff element for the forward commander and took on whatever analytic challenge he faced at the time.

We reaped significant benefits from focusing on the analytic needs of the forward command. Because the concerns of the forward command overlapped significantly with those of SOCCENT, the data we collected fed analysis for both the forward node and SOCCENT commander. Additionally, because the work we did directly supported the forward commander, he and his team were engaged in verifying and validating the products we produced post-visit. Without exception, we received clarifying or correcting comments from the commanders that ensured our analysis was current and accurate before release to the CG.

As forward commanders used our products to brief the SOCCENT CG on their progress and concerns, the boss became acquainted with the results prior to our engagements with him. Rather than a formal brief, discussions among our team, the CG, and the forward nodes became environments of shared consciousness and the dialogue centered on future action rather than a review of the past.

5: Build Collaborative Networks to Execute, Verify, and Validate Analysis

The process of active data collection also builds a network of collaborators useful for executing, verifying, and validating analysis. The network is undefined in advance of the analysis, but, in our case, it included forward commanders and their units; staff officers and analysts within the SOCCENT staff; and subject matter experts from across DOD, the interagency community, and the private sector. The combination of internal and external collaborators provided what we believe was an optimal mix of first-hand knowledge and outsider perspective. It also allowed our team of two to three people to conduct in-depth analysis for a command spread across the U.S. military’s most active region of the world.

As important as our data collection approach was, it would not have been possible without support and buy-in from both the forward commanders and key staff members at MacDill Air Force Base. At SOCCENT, the J5 director rightly mandated collaboration between planners and analysts. That collaboration was critical for building measurable requirements, garnering buy-in from the planners, gathering data, and validating results. We also built similar relationships within and across the SOCCENT J2 and J3 directorates, with USCENTCOM and U.S. Special Operations Command (USSOCOM) staffs, and with subordinate units.

In many cases, though, we relied on analysts from across DOD, the interagency community, and the private sector to provide external analyses of the operational environment relative to SOCCENT objectives. Using external analysts and companies mitigated confirmation bias and provided multiple lenses through which we viewed the problem set. We also employed two specialized private research firms to help us understand the human element of the operational environment. Combined with our own internal analysis, the networked approach provided multiple perspectives on complex problems, increasing our confidence in common findings and driving further research in areas of divergence.

6: Resist the Tyranny of Averages and Aggregation Wherever Possible

The challenge for analysts at a component or higher headquarters, such as a Theater Special Operations Command or a combatant command, is to use data that are as granular as possible while communicating useful findings at the operational and strategic levels. Unfortunately, averaging and aggregating results destroy the fidelity and value of otherwise valid analyses. Some analysts refer to this as color math, where, by bending the laws of math, a series of red, amber, or green indicators is “averaged” to produce a single color indicator for a strategic issue.6 The reality is that the strategic issue, represented by a single color, is actually a collection of smaller issues, each possibly on a different part of the spectrum. In this case, establishing clear requirements, collecting high-quality data, and building a networked team of commanders and analysts are all for naught because the process of aggregation has diluted or obfuscated findings, making them inaccurate and dangerous.

To fight the tyranny of aggregation and preserve the fidelity of findings, we used a combination of nuanced narrative and supporting visualizations. We intentionally did not use averaged numbers, thermographs, or other techniques common to DOD analyses; they are misleading, arguably inaccurate, and for good reason viewed with significant suspicion by most SOF commanders. Instead, we relied on a logical framework that guided our translation of raw data into progress toward objectives through criteria, effects, and intermediate military objectives (IMOs). Figure 1 shows a simplified version of this framework.

Figure 1. Analysis Framework

The framework decomposed plan objectives into IMOs and their related effects.7 We developed criteria for each effect that answered the question, “What does it mean for this effect to materialize?” For example, if plans called for a partner force to conduct counterterrorism, criteria may have been the unit’s ability to execute lethal and nonlethal find, fix, finish, exploit, analyze, and disseminate functions. After we collected, validated, and adjudicated the data, we did not aggregate the results. We kept our sleeves rolled up and wrote nuanced, qualitative descriptions of progress and gaps at the effect and IMO levels. While we would further distill these narratives for specific reports, detailed, qualitative evaluation at the objective level and below gave commanders the detail they needed for reallocation of resources.
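For readers who want a more concrete picture of the figure 1 logic, the sketch below shows one possible, hypothetical way to represent the decomposition: objectives break into IMOs, IMOs into effects, and effects into criteria, with qualitative narratives attached at each level rather than rolled up into a single color or score. The class names and the notional counterterrorism instance are our own illustration, not an official schema.

```python
# One possible, hypothetical representation of the figure 1 framework:
# objectives decompose into IMOs, IMOs into effects, and effects into
# criteria. Evaluations stay as qualitative narratives at every level;
# nothing is averaged into a single color or number.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Criterion:
    question: str   # "What does it mean for this effect to materialize?"
    evidence: List[str] = field(default_factory=list)  # sourced observations
    narrative: str = ""      # analyst's qualitative evaluation

@dataclass
class Effect:
    name: str
    criteria: List[Criterion]
    narrative: str = ""      # progress and gaps at the effect level

@dataclass
class IMO:
    name: str
    effects: List[Effect]
    narrative: str = ""      # progress and gaps at the IMO level

@dataclass
class Objective:
    name: str
    imos: List[IMO]
    narrative: str = ""      # distilled for specific reports, never averaged

# Notional instance built from the counterterrorism illustration above.
ct_effect = Effect(
    name="Partner force conducts counterterrorism operations",
    criteria=[
        Criterion(question="Can the unit execute lethal find, fix, finish, "
                           "exploit, analyze, and disseminate functions?"),
        Criterion(question="Can the unit execute nonlethal find, fix, finish, "
                           "exploit, analyze, and disseminate functions?"),
    ],
)
```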

Instead of stating we were yellow on a scale of red to green, we found that a narrative focused on successes and gaps in the context of each objective was the most effective form of articulating RoI. In addition to presenting findings in the context of cost, this method yielded palatable and pragmatic recommendations for commanders to make the most efficient use of their limited resources, whether at the country, regional, or AOR level.

Our approach also heavily leveraged information graphics to augment results. Using visualizations that preserved important differences within broader themes allowed the consumers, often commanders within the SOCCENT enterprise, to determine what mattered and what did not within any larger strategic issue. However, we always emphasized the importance of the narrative over the visualization.

We also avoided aggregation by communicating results of analysis often, at multiple levels, and in varied forms. While we socialized initial drafts of the analysis with SOCCENT staff action officers, we never considered analysis complete and ready to brief to the CG until the forward commander had reviewed and approved it. In every case, forward commanders welcomed the “good” with the “bad” news in the reports, likely because they viewed them as accurate reflections of reality. These reports were rich in detail and light on summaries. With concurrence and participation from the forward nodes, we engaged the CG with executive-level briefs that focused on the most important returns on the investment to a given problem set. Because the forward commanders typically video-teleconferenced in, these briefings became an opportunity for the CG, staff, and forward node to agree on a common understanding of the ground truth and to craft courses of action for increasing returns moving forward.

Once the CG had been briefed, we circulated the findings to other USCENTCOM components, USSOCOM resource managers, and anyone else who would benefit from understanding how SOCCENT was using SOF resources. The analytic cycle culminated with an AOR-level summary presentation at the SOCCENT Commander's Conference (SCC). For that brief, we distilled findings at the objective level for each forward node into five pages of analysis for the CG, focused on each of his strategic objectives. Because almost every commander in the room had seen earlier, more detailed variants of the analysis, the presentation facilitated a productive discussion about reallocation of SOF resources across the command.

Soldiers with Special Operations Command South prepare to board Army helicopter assigned to Joint Task Force–Bravo’s 1st Battalion, 228th Aviation Regiment, during joint airborne operations exercise at Soto Cano Air Base, Honduras, February 22, 2018 (U.S. Army/Maria Pinel)

7: Understand that Products Matter, But Not as Much as the Process

Rather than producing a single product briefed to the commander at the end of the cycle, we developed and executed a qualitative, evidence-based analysis process. The process supported planning and informed resource allocation decisions throughout the year and at multiple echelons of the SOCCENT enterprise. Championed and advocated for by J5 leadership at its onset, the collaborative process unified planners, executers, and analysts into a coherent cycle.

Because of budgetary and manning cycles, we believe the ordering of this process matters. Figure 2 captures what we found to be a rigorous and repeatable way to execute the analytic process while remaining integrated with planning and resourcing. Of course, any unit applying this process will need to modify it to its specific requirements and, importantly, to the commander himself.

Figure 2. Analysis Timeline and Process Example for Phase 0 Operations

In our case, the resourcing cycle turned on the SOCCENT SCC. During the conference and after briefings by all subordinate commanders, the CG would establish priorities and provide guidance for the coming year. Therefore, as shown in figure 2, we completed analysis for subordinate commanders the preceding fall and were able to brief the country and regional findings along the way to both the CG and subordinate commanders. By the time we briefed the final AOR-level analysis to the commander before the SCC, he knew it reflected reality on the ground.

During the conference, commanders used the information generated by our process as a basis to have an informed discussion about what resources they needed to achieve their objectives, while planners used it to calibrate objectives for the upcoming year’s plans. Outside of SOCCENT, the commander also used the results to justify his resource requirements to USSOCOM post-conference. Much of this was possible because we were included as members of the operational planning teams from the beginning of each plan that SOCCENT produced in the spring. Integration with those teams also gave us the legitimacy to make recommended changes at the end of the analysis cycle.

Navy seaman guides Egyptian Naval Force S-70B Sea Hawk helicopter onto flight deck of USS Carney during exercise Bright Star 2018, in Mediterranean Sea, September 10, 2018 (U.S. Navy/Ryan U. Kledzik)

Conclusion

A commander’s job is to give guidance and make decisions, and operations research and systems analysts should make that job easier by providing data that inform those decisions. We are confident that adhering to the seven principles described here will put any commander and his staff on the right path to conducting useful analysis that leads to better and/or more timely decisions. Like most valuable innovations, the process we settled on was the accumulation of failures, tinkering, and refining. And while there is certainly room for improvement, the advantages of the current approach are clear.

First, the analysis is more useful. Instead of answering a vague “How are we doing?”-type question, we provided the commander an ability to understand real-world outcomes, the opportunity cost associated with those outcomes, and information about how, if at all, he might produce better results with a reallocation of resources. The process is useful because it ties outcomes to requirements that matter to the commander.

The process produces more accurate and timely analysis built on better and more current data. Where passive data collection produces stale, incomplete data, in-person interviews allow the data collector to ask primary data sources questions tied directly to the commander’s requirements. Building a networked team of supporting analysts also increases the diversity of observations, reducing the chances that one perspective creates a biased depiction of reality.

Finally, relying on rich narrative and supporting visualizations drives analysis away from the color math and death-by-aggregation that dooms traditional analyses. In most cases, granular, contextual data are the only means of conveying the nuance of a situation. Since the process is iterative and not a once-a-year event, commanders at all levels can take in the details necessary to understand summarized documents later in the process.

We would like to finish by making a few suggestions about the people and skills needed to conduct analysis like this. Over the course of the 4 years in which we tried, failed, innovated, and improved, our team took on multiple configurations. It has included military members and contractors serving as properly trained operations research and systems analysts, social scientists, mathematicians, and lawyers. Regardless of titles, the team needs members with two primary skills: interpersonal skills for building teams and networks, and logical and analytic skills for building frameworks and conducting analysis. Neither skill set is sufficient by itself, and both are necessary for the team to work properly.

Ultimately, providing useful analysis that allows commanders to make better decisions does not require a Ph.D. or a mastery of rocket science. It requires answering questions that matter with quality data in a manner that articulates rather than averages the truth. Adhering to the seven principles in this article and applying them through a disciplined process with an enthusiastic and hard-nosed analysis team will do that for the commander in almost any environment. JFQ

Notes

1 Although we understand there are infinitely many types of analysis conducted across the Department of Defense (DOD), this article focuses on operational analysis. We define this as analysis conducted to inform the commander’s decision cycle at the operational and strategic levels of war.

2 Return on investment (RoI) is a quantitative metric used to describe efficiency of an investment. Although further research is warranted to fully define the military application of RoI, the principles still apply in this context. We are comparing what was invested, what was returned, and what was expected to be returned. Like RoI in a financial context, this yields an understanding of force efficiency but in the context of campaign plan objectives. Furthermore, our application of RoI is based on economic costs vs. accounting costs and therefore does not lend itself to a quantitative comparison as does RoI in a financial context.

3 Neither the Special Operations Command Central (SOCCENT) commanding general nor the authors view national security as a business. That said, some commonly understood business terms can be useful for conceptualizing a security problem.

4 Our process did not claim to identify causal relationships between operations, actions, and activities (OAAs) and the state of the operational environment. We also did not weight actual returns or attempt to articulate that one OAA had more of an impact on an objective than another. Because the importance of any objective could change at any time, we preferred to clearly state the actual returns rather than weight their importance to the objective, and allow the decisionmaker, typically the commanding general, to determine the relative importance of any given outcome.

5 We understand there are other investments, such as equipment, training, and so forth, but for analysis scoping reasons, we decided to focus on the scarcest and most important resource: special operations forces operators.

6 As one enlightened staff officer at SOCCENT stated, “Averaging colors is about as useful as comparing apples to dump trucks.”

7 We understand that the terms objective, intermediate military objectives, and effect have meaning in joint doctrine. However, we found it necessary to create specific definitions based on the logic of our analysis framework.