At 0900, they track a man on a screen half a world away, following his every movement for hours; at 1700, they are stuck in traffic on the way home. Yet somewhere between those moments, a life may have been ended under military procedures most people never see or even think about. Who lives this fragmented existence, and what happens when the mission follows them off the battlefield? And when the weight of that unseen decision settles, it is often a military chaplain, a guardian of conscience, a confidential counselor, or an advisor on matters of moral injury who sits across from them in quiet moments, helping them confront what they have done in their country’s name.

Since early October 2001, remotely piloted aircraft (RPAs) have become central to U.S. military operations abroad, particularly in counterterrorism, and their use continues to grow.[1] Unlike conventional warfare, RPA strikes are conducted from thousands of miles away, often from bases in the United States by American pilots and sensor operators. While this technology offers strategic advantages in intelligence, surveillance, reconnaissance, and precision strikes, it also introduces complex legal, ethical, and psychological dilemmas for the American service personnel who fly these aircraft. This paper examines the role of RPA pilots and sensor operators in the U.S. military’s targeted killing operations, with a specific focus on the General Atomics MQ-9 Reaper. It explores the operational structure, emotional toll, and broader implications of RPA warfare for military personnel. While the author’s research focuses primarily on RPAs within the United States Air Force, the structure and methodology of Reaper operations are consistent across all branches of the military.

The use of RPAs in targeted killing, often against suspected militants and rarely against U.S. citizens,[2] can constitute extrajudicial killing. Extrajudicial killing is “the deliberate act of killing individuals by government authorities or with their approval, without any legal sanction or due process.”[3] These strikes follow a formal U.S. military legal process but frequently rely on intelligence assessments that may lack transparency or accountability. At times, however, the host country does not give the American military consent to conduct RPA strikes within its borders.[4] For example, during the War on Terror, neither Afghanistan (after 2021) nor Iraq gave the United States authority to conduct drone strikes on their territory.[5] The use of drone strikes in these countries without the permission of their respective governments is a violation of sovereignty. RPA strikes are categorized as either personality strikes, which target identified individuals such as high-ranking terrorist leaders, or signature strikes, which rely on patterns of life to infer association with terrorist activity.[6]

While much of the academic literature focuses on the legality or ethics of RPAs, far less attention is paid to the daily realities of those who operate them. RPA pilots and sensor operators must toggle between high-stakes military operations and the mundanity of civilian life, as there are no boundaries between life and work. This duality can create psychological tension, especially when operators witness or contribute to lethal outcomes from afar. The concept of moral injury, described as psychological, behavioral, or spiritual harm resulting from actions that violate one’s ethical beliefs, offers a useful framework for understanding the long-term impact of RPA warfare on its human military operators.[7] Therefore, the research question this paper seeks to answer is: “How does remotely operating the MQ-9 Reaper in U.S. targeted killing operations influence the operational practices, psychological well-being, and moral decision-making of RPA pilots and sensor operators, and how can military chaplains best support this population given the distinctive demands of remote warfare?”

This paper draws on empirical research, military testimonials, and scholarly literature to examine the human dimension of RPA warfare. It also considers institutional support systems, such as the Human Performance Team, designed to mitigate psychological stress among operators. By amplifying the voices of these often-overlooked service members, this study aims to illuminate the hidden human costs of remote warfare.[8]

Division of Labor: RPA Pilots and Sensor Operators

The MQ-9 Reaper, developed by General Atomics, is among the most advanced remotely piloted aircraft in the U.S. military arsenal. It performs a variety of functions, including intelligence, surveillance, and reconnaissance (ISR) as well as precision strikes. With a base model costing around thirty-two million dollars, the aircraft is equipped with high-resolution sensors, laser-guided missiles, and extended flight endurance.[9] It typically carries at least four Hellfire missiles and can carry up to four 500-pound bombs when needed. The Reaper is designed to identify and eliminate targets with minimal risk to American personnel. Yet beneath this technological sophistication lies a complex human process that drives every decision and engagement.

RPAs are stationed at U.S. military bases around the world. They are initially launched locally and then handed over to pilots based in the United States, who conduct mission operations remotely. After the mission concludes, control is transferred back to local pilots for landing and maintenance. All post-mission support, including debriefing and aircraft servicing, occurs at the RPA’s host base. Unlike traditional deployments, stateside RPA pilots and sensor operators typically return home to their families after each shift, although RPA crews are also deployed overseas at times.

Within the structure of RPA operations, RPA pilots and sensor operators function as a coordinated team. Although both operate on American military bases, often on U.S. soil, their responsibilities are distinct, and each plays a critical role in the success and execution of missions. Understanding this division of labor sheds light on the collaborative nature of RPA warfare and the psychological burdens uniquely borne by each position.

RPA pilots are commissioned officers who typically possess a pilot’s license and have completed extensive training in aviation theory, air navigation, meteorology, aircraft operations, and combat tactics. They are responsible for navigating the aircraft, maintaining flight control, and executing weapons releases upon authorization. While the pilot operates from a remote location, the stakes remain high: a single miscalculation in trajectory or targeting can lead to mission failure, collateral damage, or unintended casualties.[10]

Sensor operators, in contrast, often enter the military directly out of high school, some as young as 17 years old. They undergo basic military training followed by specialized instruction in sensor operation. While not co-pilots, sensor operators manage the aircraft’s advanced targeting systems, including high-definition cameras, infrared sensors, and laser designators. In many cases, it is the sensor operator who guides the weapon, whether a Hellfire missile or a 500-pound bomb, to its intended target.[11]

Both pilots and sensor operators participate in pattern-of-life analysis, monitoring individuals over extended periods to assess potential militant activity. They are deeply involved in the intelligence cycle and must confirm targets during engagements. This sustained visual contact fosters a sense of familiarity with both targets and the friendly forces they support, often without those individuals ever knowing. The nature of RPA warfare is intensely visual, exposing operators to live feeds of strikes and their aftermath. After a strike, the pilot must fly the aircraft back over the site to assess the results, meaning both the pilot and sensor operator are exposed to the carnage not once, but twice. In addition, pilots and sensor operators can zoom in to an apparent distance of roughly ten feet from the strike site, giving them an unmistakably clear, close-up view of bodies, debris, and the immediate aftermath of destruction.

The interdependence between pilots and sensor operators is foundational to successful RPA missions. Their communication must be seamless, particularly in high-pressure situations where decision windows are brief. Although pilots and sensor operators have mentioned in interviews that they often have a preferred colleague to work with, their assigned partner on any given day is unpredictable. Despite this close collaboration, research suggests differences in cognitive workload and emotional impact between the two roles. Sensor operators, because of their close visual engagement and direct control over weapons delivery, often experience heightened psychological stress. Their age and relative inexperience may further increase vulnerability to moral injury and emotional fatigue. These realities underscore the need for tailored support strategies that address the specific psychological challenges associated with each role in RPA operations.

Methodology

This study employs a qualitative, interdisciplinary research design to examine the operational structure and psychological impact of MQ-9 Reaper missions on U.S. RPA pilots and sensor operators. It draws on empirical research, military testimonials, and scholarly literature to analyze the human dimension of targeted killing operations, with particular attention to the sustained visual exposure, division of labor, and post-strike assessment procedures that shape pilot and sensor operator experiences. Data consist of publicly available academic studies in books and articles, published interviews, interviews conducted at U.S. Air Force bases (including Holloman, Creech, Shaw, and Ramstein), and institutional materials, including documentation of support systems such as the Human Performance Team at Creech Air Force Base.

The data were analyzed to identify recurring patterns and themes related to moral injury, psychological strain, and institutional response. The author’s personal observations are also included, based on lengthy interviews with pilots, sensor operators, chaplains, psychologists, medical doctors, commanders, and other relevant personnel. Potential biases include reliance on secondary sources, publication bias toward controversial or adverse outcomes, and limited access to classified operational data, as the author does not hold a security clearance. To mitigate these risks, the study triangulates across multiple source types, avoids causal prevalence claims, and explicitly distinguishes documented findings from normative interpretation.

Moral Injury Among RPA Pilots and Sensor Operators

Moral injury occurs when individuals commit, fail to prevent, or witness actions that violate their deeply held moral or ethical beliefs. Molendijk et al. found that moral injury occurs in five to twenty-five percent of the soldier population.[12] These violations may stem from acts of commission, such as participating in a strike that unintentionally results in civilian casualties, or acts of omission, such as being unable to intervene in morally distressing situations. The term is controversial within some branches of the military; the Marine Corps, for example, avoids “moral injury” in favor of “inner conflict.”[13] The purpose is to avoid any suggestion that the soldier acted unlawfully or violated the laws of war.

Investigative organizations such as The Bureau of Investigative Journalism[14] and The Long War Journal[15] have long chronicled the number of civilians killed by drones and the effects of collateral damage. RPA pilots and sensor operators may witness acts such as sexual assault, bestiality, domestic violence, or other forms of abuse that are culturally normalized in the regions they monitor yet deeply disturbing through their own moral lens.[16] This exposure, combined with the inability to act, can lead to profound feelings of helplessness, guilt, and despair. Moral injury is often compounded by a sense of betrayal, whether by commanding officers, military institutions, or political leadership, and may manifest as shame, social withdrawal, and a fractured sense of identity.

In “Moral Risk, Moral Injury, and Institutional Responsibility,” Adam Henschke argues that there are two types of moral injury.[17] The first is the psychological trauma that arises from the contradiction between one’s actions and one’s morality. The second is the numbing or desensitizing of an individual’s moral character, whereby a person becomes hardened to the daily reality of killing and to the morally compromising actions they are expected to take. Henschke’s work can be extended to drone warfare, where AI-assisted decision-making may contribute to moral injury by placing human operators in ethically problematic situations or by creating situations that later lead to profound distress.

Sassan Gholiagha points out that the language of the target, the language of the body, and the language of dehumanization all contribute to the desensitization of killing in RPA targeted killing.[18] The human being that is sought simply becomes the “target” and the humanity is removed from that individual. The body itself simply carries a certain signature and carries no greater implication than an object that needs to be destroyed on a screen. The language of dehumanization shows up in terms such as “bug splat” or “squirter,” which are used to describe recently deceased targets or human beings running from the explosions.

Paul Lushenko and Srinjoy Bose, in a November 2025 survey of 1,500 Americans, found that American public support for drone warfare is tenuous and depends on maintaining a delicate balance between technological capability and human oversight.[19] The findings suggest that preserving a “man in the loop” in drone operations is not only ethically important but also politically beneficial, reinforcing the need for policies like the Defense Department’s Directive 3000.09 on Autonomy in Weapon Systems as AI-enabled systems become more prevalent. Support is also shaped by how drone targets are framed: while dehumanizing rhetoric can boost short-term public approval by heightening emotions, it risks weakening democratic accountability and expanding perceptions of legitimate targets over time. The assumption that fully autonomous drones will increase public acceptance of drone warfare is misguided; instead, their use may provoke political backlash and undermine perceived legitimacy. These insights highlight the need for policymakers to prioritize responsible AI initiatives and ensure transparency, accountability, and human involvement in drone operations.

In the context of RPA operations, several features heighten susceptibility to moral injury. First, the clarity of RPA imagery, which can render a scene as if viewed from roughly ten feet away, can humanize targets in a way not traditionally experienced in warfare. Operators may observe individuals for days or weeks before receiving an order to strike, leading to emotional attachment or heightened empathy; they may even feel they have developed a relationship with the target’s family or children. Second, the lack of physical risk and the sterile, screen-mediated environment of the control room may intensify internal awareness: operators are acutely aware of the consequences of their actions yet detached from the physical environment of war. This close-range exposure to combat, along with the follow-up pass to confirm the mission was accomplished, can leave vivid, firsthand visual images.

The constant death and destruction that RPA pilots and sensor operators experience can lead to several problems. Repeatedly witnessing or performing violence can shift a person’s mind and body into a long-term “threat mode,” in which stress systems stay activated and everyday life feels unsafe. Over time, this can produce trauma symptoms such as intrusive memories, nightmares, avoidance, hypervigilance, irritability, sleep problems, and emotional numbness or detachment. It can also produce the opposite effect, desensitization, in which blunted empathy and emotional response damage relationships. Repeated exposure of the kind sensor operators and pilots experience can increase depression, anxiety, substance abuse, and difficulties with trust and connection. People who commit violence, particularly when it conflicts with their values, can experience guilt and shame, or can learn to justify and normalize harm, so that violence and aggression come to seem more acceptable.

Some Things to Consider When Supporting This Unique Population of Servicemembers

In a relevant study, Paul Lushenko and Keith Carter[20] examined how military chaplains in Active Duty, National Guard, and Reserve components perceive the legitimacy of US drone strikes. Through a survey of a small sample of US Army chaplains, they found that while chaplains’ assessments of strike legitimacy tend to covary, they can also diverge. Notably, chaplains appear to set aside the formal legality of strikes in undeclared theaters of operations: even when a chaplain perceives a strike as otherwise legitimate, they may support it less if it occurs in an undeclared theater. Lushenko and Carter help show how chaplains understand legitimacy and how those views may shape how they serve the soldiers and communities in their care.

RPA pilots and sensor operators represent a distinct and highly specialized segment of the military’s war-fighting force. As such, their unique operational environment, challenges, and experiences require tailored approaches to support and development. To effectively serve this population, the military must implement support measures that are specifically designed to address their needs, both professionally and personally, rather than relying solely on traditional frameworks used for manned aviation communities. Pilots and sensor operators see graphic images of the destruction that they create and then must return to the scene to make sure that the job was completed, thus exposing them to the trauma again.

A basic understanding of key technical parameters is essential to fully grasp how RPA operations function. One of the most critical factors is the inherent delay in responsiveness; there is typically a two-second lag between a pilot’s input and the aircraft’s reaction. This delay can significantly impact mission performance, particularly in high-stakes situations, and requires both pilots and sensor operators to develop precise timing and coordination. The same latency also applies to weapons deployment, increasing the risk of collateral damage if not carefully accounted for.

RPA crews typically work eight-hour shifts; however, these often extend well beyond scheduled hours due to required tasks such as debriefings, mission reviews, and administrative responsibilities. Additionally, they frequently rotate through shifting schedules, which can disrupt sleep patterns and place considerable strain on their personal lives. These shift schedules have reportedly improved tremendously since the early years of RPA use. In addition, RPA pilots and sensor operators who are stateside immediately return to civilian life after each shift, and very few people understand their jobs and their implications.

There are several things to consider when identifying measures that can support this unique population of servicemembers. Although the stigma has started to fade, a culture of inferiority has long existed between RPA pilots and sensor operators and traditional fighter pilots. RPA operators have often been viewed as less prestigious than traditional pilots, a perception reflected in everything from the lack of military medals to persistent comparisons with video gaming. The Air Force has even used video game tournaments to recruit RPA pilots and sensor operators.[21] This perception has contributed to a diminished sense of professional identity within the RPA community, though it has gradually improved over time, as RPA pilots and sensor operators now actively choose this career path rather than being assigned to it after disqualification from traditional pilot roles. However, bases like Creech are now used primarily for RPA operations, which can leave the RPA population feeling isolated and misunderstood.

In addition, the battlefield vision of RPA pilots and sensor operators can lead to more psychological problems because of the close-up visuals of carnage they view and the need to return to the battle site to ensure that the mission was completed. Through their camera systems, the pilot and sensor operator can observe activities as if from ten feet away. They may see bodies blown apart or structures destroyed in graphic detail that fighter pilots never see. If moral injury occurs, this second viewing can create or compound the problem.

Also, the abrupt transition from operational duties to domestic life can be traumatic. RPA personnel may execute lethal missions in the morning and attend a child’s soccer game in the evening, with no opportunity for decompression or psychological recalibration. This duality of roles, servicemember and civilian, can undermine mental well-being and blur the boundaries of responsibility and morality. There is no work-life separation for these servicemembers.

Military institutions have begun acknowledging the psychological burden of RPA operations. At Creech Air Force Base, for example, the Human Performance Team offers guidance from chaplains, psychologists, and behavioral health specialists to pilots and sensor operators. In an interview, Lt. Col. DS, a psychiatrist, explained the purpose of preparing military personnel with presentations and literature on the psychology of killing: if personnel are taught how to view these issues ahead of time, they are better equipped to deal with psychological problems when they occur. Just War Theory and an understanding of why a strike occurs are part of these presentations. The presentation is not currently required but is often requested by units as air crews change out.[22]

To effectively serve this population, it is essential to understand the mission from the perspective of those who carry it out. Initiating open conversations with these individuals can foster cultural understanding and encourage ongoing dialogue. Support staff should also familiarize themselves with relevant literature, such as On Killing Remotely[23] and Reaper Force[24]. Furthermore, the stigma around seeking mental health services in the military has decreased; when commanders actively support the use of these services, their units are more likely to follow suit. Because servicemembers fear losing their security clearances, they may hesitate to discuss their missions with support staff. One way to ameliorate this concern would be to have pertinent support care providers be “read on” to the specifics of the mission when they join a unit.[25]

From a chaplain’s perspective, it is recommended that chaplains “not be too theological in nature and to use more emotional intelligence in providing support to crews.” Chaplains need to cultivate presence, empathy, and cultural neutrality in their treatment of servicemembers. Much of the military population is young and does not identify with any religious denomination. Chaplains should be accessible, avoid imposing religious frameworks, refrain from judgment, and participate in training simulators to understand the operators’ environment. They should also be regularly present and visible to the personnel in their unit.[26]

Operational challenges are also compounded by a lack of public understanding and recognition. RPA crews often report a sense of anonymity or dismissal compared to their counterparts in traditional combat roles. Most people do not even know that RPA operations exist, let alone how they work. This sense of invisibility may intensify feelings of moral ambiguity, especially when the outcome of a mission leads to unintended civilian casualties or when public narratives cast RPA warfare as “clean” or “surgical,” a portrayal that oversimplifies the emotional and operational complexities involved.

Most importantly, the psychological toll on the sensor operator who guides the weapon to the target cannot be overstated. Because many are young and inexperienced, this group may be especially vulnerable to moral injury, PTSD, and other psychological challenges. Newly assigned sensor operators should be closely monitored and checked in with regularly to support their mental health. The author also recommends structured counseling and mentoring beginning as soon as they arrive on base to reduce the risk of long-term psychological trauma.

Conclusion

As RPA warfare becomes more deeply integrated into U.S. military doctrine, the distinct contributions and psychological burdens of both pilots and sensor operators warrant further attention. Equitable recognition, appropriate training, and tailored mental health resources are essential to ensuring the sustainability and ethical integrity of RPA operations.

Moreover, the psychological distancing enabled by RPA operations poses ethical questions about the nature of violence. Unlike traditional combat, where soldiers face reciprocal risk, RPA pilots are physically removed from harm. This asymmetry introduces concerns about moral disengagement, the process by which individuals rationalize or emotionally detach from harmful actions. While RPA operators may experience moral injury and psychological strain, the institutional framework of RPA warfare often abstracts violence into data streams and strategic outcomes, thereby minimizing public scrutiny and moral reckoning.

RPA warfare also raises concerns about precedent. As other nations and non-state actors acquire similar technologies, the normative standards set by the U.S. will influence global behavior. Without robust legal frameworks and transparent oversight, the proliferation of targeted killing capabilities may erode international norms and destabilize global security.

The cumulative effect of prolonged moral injury can impair not only individual mental health but also unit cohesion and long-term military readiness. As RPA warfare becomes an entrenched feature of modern combat, a deeper understanding of moral injury, and how to prevent or remediate it, will be vital to safeguarding the psychological welfare of those behind the controls.

The integration of remotely piloted aircraft into U.S. military operations has fundamentally altered the landscape of modern warfare. While RPAs offer strategic advantages such as operational reach, precision, and reduced risk to American personnel, they also introduce unfamiliar problems for American service men and women. This paper examined the distinct roles of RPA pilots and sensor operators, the operational dynamics of targeted killings, and the moral injury that may result from such missions.


  1. Lt. Col. Johnny Duray, “Remotely Piloted Aircraft: Implications for Future Warfare,” Air & Space Forces Magazine, February 1, 2020, https://www.airandspaceforces.com/article/remotely-piloted-aircraft-implications-for-future-warfare/

  2. Christine Sixta Rinehart, “Targeted Killing: The Constitutionality of Killing US Citizens,” in The Official Record, ed. Peter Finn and Robert Ledger (Manchester University Press, 2024).

  3. Elizabeth Mohn, “Extra-judicial killing,” EBSCO, 2025, https://www.ebsco.com/research-starters/law/extrajudicial-killing-state-killing

  4. Christine Sixta Rinehart, Drones and Targeted Killing in the Middle East and North Africa (Lanham, MD: Lexington, 2018).

  5. See Al Jazeera, “US admits it did not give Iraq notice of strikes despite earlier claims,” February 6, 2024, https://www.aljazeera.com/news/2024/2/6/us-admits-it-did-not-give-iraq-notice-of-strikes-despite-earlier-claims, and Al Jazeera, “Taliban: ‘Consequences’ if US drones enter Afghan airspace,” September 29, 2021, https://www.aljazeera.com/news/2021/9/29/taliban-consequences-if-us-continues-to-fly-drones-in-airspace.

  6. Jack McDonald, Enemies Known and Unknown (New York: Oxford University Press, 2017).

  7. Joseph O. Chapa, Is Remote Warfare Moral? Weighing Issues of Life and Death from 7,000 Miles (Public Affairs, 2022).

  8. The author’s firsthand data come from extensive research, including semi-structured interviews with RPA pilots and sensor operators at Creech, Holloman, and Ramstein Air Force Bases that will be used throughout this paper.

  9. General Atomics, “MQ-9A Reaper,” 2025, https://www.ga-asi.com/remotely-piloted-aircraft/mq-9a.

  10. The United States Air Force, “Remotely Piloted Aircraft (RPA) Pilot,” 2025, https://www.airforce.com/careers/aviation-and-flight/pilot/remotely-piloted-aircraft-pilot.

  11. The United States Air Force, “Remotely Piloted Aircraft (RPA) Sensor Operator,” 2025, https://www.airforce.com/careers/aviation-and-flight/remotely-piloted-aircraft-rpa-sensor-operator.

  12. Tine Molendijk, Willemijn Verkoren, Annelieke Drogendijk et al., “Contextual Dimensions of Moral Injury: An Interdisciplinary Review,” Military Psychology 34, no. 6 (2022): 742–753. doi: 10.1080/08995605.2022.2035643

  13. Tine Molendijk, Willemijn Verkoren, Annelieke Drogendijk et al., “Contextual Dimensions of Moral Injury: An Interdisciplinary Review,” Military Psychology 34, no. 6 (2022): 742–753. doi: 10.1080/08995605.2022.2035643

  14. The Bureau of Investigative Journalism, 2025, https://www.thebureauinvestigates.com/.

  15. The Long War Journal, 2025, https://www.longwarjournal.org/.

  16. Matt J. Martin and Charles W. Sasser, Predator: The Remote-Control Air War over Iraq and Afghanistan: A Pilot’s Story (Minneapolis, MN: Zenith, 2010).

  17. Adam Henschke, “Moral Risk, Moral Injury, and Institutional Responsibility,” International Journal of Intelligence and CounterIntelligence, 38, no. 4 (2024):1231–1248. https://doi.org/10.1080/08850607.2024.2382031

  18. Sassan Gholiagha, “Individualized and Yet Dehumanized? Targeted Killing via Drones,” Behemoth: A Journal on Civilisation 8, no. 2 (2015): 128–153. doi: 10.6094/behemoth.2015.8.2.873

  19. Paul Lushenko and Srinjoy Bose, “How dehumanizing language, video images, and human oversight affect public opinion on drone warfare,” The Bulletin of the Atomic Scientists, February 3, 2026, https://thebulletin.org/2026/02/how-dehumanizing-language-video-images-and-human-oversight-affect-public-opinion-on-drone-warfare/.

  20. Paul Lushenko and Keith L. Carter, “Chaplains and the Legitimacy of Drone Warfare: Experimental Evidence from the US Army,” Politics and Religion 18 (2025): 351–82.

  21. Blake Stilwell, “The Air Force Wants You to Play Video Games in the Name of National Security,” Military.com, September 15, 2023, https://www.military.com/off-duty/games/2023/09/15/air-force-wants-you-play-video-games-name-of-national-security.html.

  22. Lt. Col. DS, interview at Creech Air Force Base, January 8, 2025.

  23. Wayne Phelps, On Killing Remotely (New York: Little, Brown, and Company, 2021).

  24. Peter Lee, Reaper Force-Inside Britain’s Drone Wars (London: John Blake, 2018).

  25. Lt. Col. DS, interview at Creech Air Force Base, January 8, 2025.

  26. Chaplain D, interview at Creech Air Force Base, January 8, 2025.