The Algorithm That Kills Random People All Over the World
Essay 77.5 of 100: Unvarnished History
On October 14, 2011, a CIA drone strike killed Abdulrahman al-Awlaki in Yemen. He was sixteen years old. He was an American citizen. He had gone to Yemen to find his father, Anwar al-Awlaki, also an American citizen, who had been killed by a CIA drone strike two weeks earlier.
Abdulrahman was not charged with a crime. He was not tried. He was not given the opportunity to appear before a judge, present a defense, or appeal a sentence. He was eating dinner at an outdoor restaurant with his teenage cousin when the missile hit. His cousin also died.
When the Obama administration was asked to justify the killing of a sixteen-year-old American citizen, former Press Secretary Robert Gibbs, by then a senior adviser to Obama’s re-election campaign, said Abdulrahman should have “had a more responsible father.” This is not a one-party problem. It is a rich-versus-everyone-else problem. Rich people can buy whatever world they want. You and I cannot.
No one was charged. No one was fired. The program that killed the U.S. teenager continues. All because he had a scary-sounding name. All because the people who extrajudicially kill people knew they would not be held to account. All because war never changes and war is a racket.
The Legal Void
On September 18, 2001, seven days after the World Trade Center collapsed, Congress passed the Authorization for Use of Military Force. The AUMF authorized the president to use “all necessary and appropriate force” against those who “planned, authorized, committed, or aided” the September 11 attacks, or who “harbored such organizations or persons.”
That law has never been repealed. It has never been significantly amended. It has been used to justify military action in at least seven countries — Afghanistan, Iraq, Syria, Yemen, Somalia, Libya, and Niger — and it was used to authorize the killing of Abdulrahman al-Awlaki, an American teenager eating dinner, under a legal theory that has never been reviewed by any court.
The theory is this: the president has the authority to order the killing of any person determined to be an “imminent threat” to the United States, anywhere on earth, without judicial review, without charge, without trial, and without accountability for error. This determination does not require evidence that would survive a courtroom. It requires inclusion on a document called the disposition matrix — a classified database of human beings the United States government has decided may be killed.
In practice, the president’s role in individual targeting decisions has been progressively replaced by algorithmic recommendation. The disposition matrix does not require presidential review of individual strikes. It requires a bureaucratic process that a private company’s software increasingly drives.
The disposition matrix was never authorized by Congress. It was never reviewed by the Supreme Court. It was created administratively, by executive branch officials, and it is maintained by a bureaucratic process that has no public accountability, no external oversight, and no mechanism by which the people on it can know they are on it, contest their inclusion, or appeal the determination.
It is, in the most precise legal sense, a kill list. And for the past two decades, private companies have been paid billions of dollars to build and operate the infrastructure that makes it run.
The executive is the problem. The legislature is the problem. The judiciary is the problem.
The Machine
Palantir Technologies was founded in 2003 by Peter Thiel, Alex Karp, and three others. Its early funding came partly from the CIA’s venture capital arm, In-Q-Tel. Its earliest government contracts were with the CIA and the Department of Defense. Its software integrates and analyzes data from multiple intelligence sources to produce targeting recommendations — assessments of who is where, what they are doing, and what threat they represent.
In 2025, Palantir’s annual revenue from government contracts exceeded two billion dollars. The company’s stock price has made its CEO, Alex Karp, worth approximately seven billion dollars.
On a February 2025 investor call, Karp described Palantir’s mission as follows: “Palantir is here to disrupt and make the institutions we partner with the very best in the world and, when it’s necessary, to scare enemies and on occasion kill them.”
He said this with a smile.
Palantir is the company that provides the data integration and targeting infrastructure that populates the disposition matrix. This is the company whose software produces the recommendations that end in missiles fired at outdoor restaurants in Yemen.
Karp has a PhD in social theory. His doctoral dissertation, published in 2002, is titled Aggression in the Life-World. Its opening sentence: “This work began with the observation that many statements have the effect of relieving unconscious drives, not in spite, but because, of the fact that they are blatantly irrational.”
He wrote his dissertation on the mechanism by which irrational public utterances relieve unconscious aggressive drives. He now performs that mechanism on investor calls, with seven billion dollars and a classified government contract, and he has built a system that enacts the underlying impulse at industrial scale.
The smile on the earnings call is not incidental. It is diagnostic.
When Algorithms Murder on Behalf of the Rich
The United States government acknowledges that drone strikes kill civilians. The Bureau of Investigative Journalism has documented between 910 and 2,200 confirmed civilian deaths from US drone strikes in Pakistan, Yemen, Somalia, and Afghanistan since 2004. The actual number is almost certainly higher — many strikes occur in areas where independent verification is impossible, and the US government classifies all military-age males in a strike zone as combatants unless posthumously proven otherwise.
This classification methodology has a name: signature strikes. A signature strike does not require that the target be identified as a specific individual. It requires that the target’s behavior pattern — their “signature” — match a profile associated with militant activity. The profile is generated by algorithmic analysis of surveillance data. The determination that a behavioral pattern is a “signature” of militancy is made by analysts working with that data.
The analysts are working with Palantir’s software. Biometrics were just coming online in 2005 when I arrived in Baghdad. Two decades later, the trigger has passed through five generations of algorithms. Eliminating humans has been outsourced to heuristics and to Karp’s pathologies.
When a signature strike kills a wedding party — which has happened, and has been documented — no one is prosecuted. When it kills a farmer whose phone was in proximity to a suspected militant’s phone — which is the actual evidentiary standard in some cases — no one is fired. When it kills a sixteen-year-old American citizen eating dinner, the family has no legal recourse. The lawsuit brought by the al-Awlaki family was dismissed in 2014; the courts have repeatedly held that executive targeting decisions are beyond judicial review.
There is no court in which you can challenge your inclusion on the disposition matrix. There is no court in which the families of those killed by signature strikes can seek accountability. There is no mechanism by which the algorithm’s errors are corrected before the next strike.
The algorithm is not held accountable. The company that built it is not held accountable. The officials who approved the strikes are not held accountable. The strikes continue.
The Domestic Pipeline
The same infrastructure does not stay overseas.
Palantir holds contracts with Immigration and Customs Enforcement, including a system called FALCON. FALCON integrates data from multiple government databases to track, identify, and locate undocumented immigrants in the United States. It is the same data integration and pattern-matching architecture used for targeting overseas, applied domestically.
The predictive policing software Palantir has sold to dozens of American police departments operates on the same logic as signature strikes: behavioral pattern analysis to identify individuals as threats before they have committed a documented crime. The person identified as a threat by the algorithm does not know they have been identified. They cannot contest the identification. They have no legal recourse until the identification results in an arrest.
The architecture is identical. The legal framework is different — domestic law still requires some form of judicial oversight before detention, though that oversight has been systematically eroded. But the underlying logic is the same: an algorithm identifies a target, a human being takes action against the target, the algorithm is not accountable for the outcome.
Essay 84 in this series covers the militarization of American police. The Pentagon’s 1033 program brought military equipment from foreign battlefields to American streets. Palantir brought the targeting logic. The convergence is not accidental. It is the domestic expression of the same system that operates overseas — the system in which algorithmic determination of threat status replaces judicial determination of guilt, and in which private companies profit from the determination.
The Historical Record
This is not new. The architecture is new. The principle is not.
The Doctrine of Discovery established in 1493 that certain people — those outside the boundaries of Christian European civilization — had no legal standing. Their lives could be taken, their land could be seized, their labor could be extracted, without accountability under any law that the taking, seizing, or extracting power recognized as binding. The legal void that permitted the encomienda and the requerimiento is the same legal void that permits the disposition matrix.
COINTELPRO, the FBI’s program to destroy Black civil rights organizations, the American Indian Movement, and socialist political groups, operated on the same logic: a government agency determined, without judicial oversight, that certain individuals and organizations were threats to national security, and then took action to neutralize those threats — including surveillance, infiltration, defamation, and in the case of Fred Hampton, assassination. No charges. No trial. No accountability.
The Church Committee, which investigated COINTELPRO in 1975, concluded that the program represented “a sophisticated vigilante operation” conducted by the government against its own citizens. The reforms the committee recommended — the Foreign Intelligence Surveillance Act, the prohibition on assassination of foreign leaders, the oversight mechanisms — have been progressively dismantled since September 11, 2001.
What Palantir represents is the privatization and automation of the same logic. The government does not need to maintain a secret program to neutralize its designated enemies. It can contract with a private company to build the infrastructure that identifies them, and then use a classified legal framework to authorize action against them, and then classify the results so that no court can review them, and then renew the contract.
The private company profits regardless of whether the identified target was actually a threat. The private company profits regardless of whether the strike killed the right person. The private company profits regardless of how many wedding parties the algorithm identifies as militant gatherings.
This is the cui bono of the disposition matrix. The question is not whether the United States has enemies. The question is who benefits from a system in which the determination of who is an enemy is made by a private algorithm, acted upon without judicial review, and insulated from accountability by classification.
Alex Karp benefits. Palantir shareholders benefit. The defense contracting ecosystem benefits. The officials who authorize the strikes benefit from the insulation of the classification system.
The people on the list do not know they are on the list. Their families have no legal recourse. The algorithm does not appear in court. The algorithm does not mourn murdered teenagers.
The Unanswered Question
No democratic body has debated whether algorithmic targeting should be legal. No court has reviewed whether the disposition matrix is constitutional. No legislature has established standards for what evidence is required before a human being can be added to a kill list. No oversight mechanism exists to determine whether the private companies building the infrastructure are operating within any ethical constraints whatsoever.
These are not oversights. They are design features. The classification system exists to prevent the questions from being asked in public. The legal framework exists to prevent the questions from being answered in court. The contract structure exists to ensure the private companies face no accountability for the outcomes their software produces.
A sixteen-year-old American citizen was eating dinner in Yemen. The algorithm said he was there. A missile was fired. He died. You do not know if you are on the list. Neither do I.
Government officials said a murdered kid with a weird name and an American passport should have had a more responsible father.
No one was charged. No one was fired. The contract was renewed. Your teenager is fair game.
This is the system. Not the aberration. Not the exception. The system, operating as designed, producing the outcomes it was designed to produce, for the benefit of the people who built it.
The question the republic has not yet been willing to answer is this: at what point does a government that kills its own citizens without charge, trial, or accountability, using software built by a private company whose CEO smiles about killing on investor calls, cease to be a republic in any meaningful sense?
We are somewhere past that point. The answer to the question is already behind us.
The record should show that someone noticed.
The record shows that you and I and anyone who dissents can be put on the list. No judicial process means no rule of law.
Previous: Essay 77 — The Private Military Complex: Blackwater and the Contractors
Next: Essay 78 — Abu Ghraib and the Chain of Command
Out of order intentionally, but still 100 in toto.
Penfist is a combat veteran who served with the Army National Guard in Iraq and Afghanistan, a Marine-trained combat correspondent (MOS 4341), and the author of Dispatches from a Dying Empire at dyingempire.org. He grew up in Bangladesh and Haiti, was raised by Mennonite and Amish parents, and is a naturalized U.S. citizen.