In 2014, Michael Hayden, the former director of both the CIA and NSA, proclaimed that
"we kill people based on metadata." Now, a new examination of previously
published Snowden documents suggests that many of those people may have been innocent.
Last year, The Intercept published documents detailing the NSA’s SKYNET
programme. According to the documents, SKYNET engages in mass
surveillance of Pakistan’s mobile phone network, and then uses a machine learning algorithm on the cellular network metadata of 55 million people to try to rate each person’s likelihood of being a terrorist.
Patrick Ball—a data scientist and the director of research at the Human Rights Data Analysis Group—who
has previously given expert testimony before war crimes tribunals,
described the NSA’s methods as “ridiculously optimistic” and “completely
bullshit.” A flaw in how the NSA trains SKYNET’s machine learning
algorithm to analyse cellular metadata, Ball told Ars, makes the results scientifically unsound.
Somewhere between 2,500 and 4,000 people have been killed by
drone strikes in Pakistan since 2004, and most of them were classified
by the US government as “extremists,” the Bureau of Investigative
Journalism reported. Based on the classification date of “20070108” on one of the SKYNET slide decks
(which themselves appear to date from 2011 and 2012), the machine
learning program may have been in development as early as 2007.
In the years that have followed, thousands of innocent people in
Pakistan may have been mislabelled as terrorists by that “scientifically
unsound” algorithm, possibly resulting in their untimely demise.
SKYNET works like a typical modern Big Data
business application. The program collects metadata and stores it on
NSA cloud servers, extracts relevant information, and then applies
machine learning to identify leads for a targeted campaign. Except
instead of trying to sell the targets something, this campaign, given
the overall business focus of the US government in Pakistan, likely
involves another branch of the US government—the CIA or military—that
executes its “Find-Fix-Finish” strategy using Predator drones and on-the-ground death squads.
The program, the slides tell us, is based on the assumption that
the behaviour of terrorists differs significantly from that of
ordinary citizens with respect to some of the behavioural properties extracted from the metadata. However, as
The Intercept’s exposé last year made clear, the
highest rated target according to this machine learning program was
Ahmad Zaidan, Al-Jazeera’s long-time bureau chief in Islamabad.
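The assumption at work can be illustrated in miniature. The sketch below is a toy, with entirely invented features and weights rather than anything from the NSA’s actual model: a simple weighted score over behavioural metadata features shows how a reporter who travels and calls like the insurgents he covers ends up scoring almost as high as they do.

```python
# Toy illustration (invented features and weights) of why a metadata
# classifier can flag a journalist: the model only sees behavioural
# features, and a reporter covering insurgents looks, in metadata
# terms, much like the people he interviews.

# Hypothetical behavioural features, each scaled 0-1:
# (travel to conflict areas, calls to militant-linked numbers, SIM swaps)
profiles = {
    "ordinary citizen": (0.1, 0.0, 0.1),
    "insurgent":        (0.9, 0.9, 0.8),
    "war reporter":     (0.9, 0.8, 0.3),  # travels and calls like his sources
}

weights = (0.4, 0.4, 0.2)  # invented weighting over the three features

def suspicion_score(features):
    """Weighted sum standing in for the classifier's output score."""
    return sum(w * f for w, f in zip(weights, features))

for name, feats in profiles.items():
    print(f"{name:18s} score = {suspicion_score(feats):.2f}")
```

On these made-up numbers the reporter scores about 0.74 against the insurgent’s 0.88, while the ordinary citizen sits near 0.06: the behavioural overlap, not any actual affiliation, drives the rating.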
As The Intercept reported, Zaidan frequently travels to regions
with known terrorist activity in order to interview insurgents and
report the news. But rather than questioning the machine learning that
produced such a bizarre result, the NSA engineers behind the algorithm
instead trumpeted Zaidan as an example of a SKYNET success in their
in-house presentation, including a slide that labelled Zaidan as a
“MEMBER OF AL-QA’IDA.”
If 50 percent of the false negatives (actual “terrorists”) are
allowed to survive, the NSA’s false positive rate of 0.18 percent would
still mean thousands of innocents misclassified as “terrorists” and killed. The slides report that the NSA’s best-performing configuration cuts the false positive rate to 0.008 percent.
However, even 0.008 percent of the Pakistani population still
corresponds to 15,000 people potentially being misclassified as
“terrorists” and targeted by the military—not to mention innocent
bystanders or first responders who happen to get in the way.
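The base-rate arithmetic behind these figures is easy to reproduce. A minimal sketch, assuming roughly 55 million people on the surveilled network and a total Pakistani population of about 190 million (both figures approximate):

```python
# Back-of-the-envelope base-rate arithmetic for the false positive
# figures discussed above. Population sizes are approximate.

surveilled = 55_000_000     # people on the monitored mobile network
pakistan_pop = 190_000_000  # approximate total population of Pakistan

fp_rate_high = 0.0018       # 0.18 percent false positive rate
fp_rate_low = 0.00008       # 0.008 percent false positive rate

# Innocents flagged at each rate
flagged_high = surveilled * fp_rate_high
flagged_low = pakistan_pop * fp_rate_low

print(f"0.18%  of {surveilled:,} surveilled:  {flagged_high:,.0f} people")
print(f"0.008% of {pakistan_pop:,} population: {flagged_low:,.0f} people")
```

Even the “good” rate of 0.008 percent, applied to a population that large, flags on the order of 15,000 people; the 0.18 percent rate flags tens of thousands more.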
Security guru Bruce Schneier agreed. “Government uses of big data are inherently different from corporate uses,” he
told Ars. “The accuracy requirements mean that the same technology
doesn’t work. If Google makes a mistake, people see an ad for a car they
don’t want to buy. If the government makes a mistake, they kill innocents.”
Algorithms increasingly rule our lives. It’s a small step from
applying SKYNET logic to look for “terrorists” in Pakistan to applying
the same logic domestically to look for “drug dealers” or “protesters”
or just people who disagree with the state. Killing people “based on
metadata,” as Hayden said, is easy to ignore when it happens far away in
a foreign land. But what happens when SKYNET gets turned on us—assuming
it hasn’t been already?