The Spyware Used in Intimate Partner Violence

Rahul Chatterjee∗, Periwinkle Doerfler†, Hadas Orgad‡, Sam Havron§, Jackeline Palmer¶, Diana Freed∗,
Karen Levy§, Nicola Dell∗, Damon McCoy†, Thomas Ristenpart∗

∗Cornell Tech   †New York University   ‡Technion   §Cornell University   ¶Hunter College
Abstract—Survivors of intimate partner violence increasingly
report that abusers install spyware on devices to track their
location, monitor communications, and cause emotional and
physical harm. To date there has been only cursory investigation
into the spyware used in such intimate partner surveillance (IPS).
We provide the first in-depth study of the IPS spyware ecosystem.
We design, implement, and evaluate a measurement pipeline that
combines web and app store crawling with machine learning to
find and label apps that are potentially dangerous in IPS contexts.
Ultimately we identify several hundred such IPS-relevant apps.
While we find dozens of overt spyware tools, the majority are
“dual-use” apps — they have a legitimate purpose (e.g., child
safety or anti-theft), but are easily and effectively repurposed
for spying on a partner. We document that a wealth of online
resources are available to educate abusers about exploiting apps
for IPS. We also show how some dual-use app developers are
encouraging their use in IPS via advertisements, blogs, and
customer support services. We analyze existing anti-virus and
anti-spyware tools, which universally fail to identify dual-use
apps as a threat.
I. INTRODUCTION
Intimate partner violence (IPV) affects roughly one-third of
all women and one-sixth of all men in the United States [54].
Increasingly, digital technologies play a key role in IPV
situations, as abusers exploit them to exert control over their
victims. Among the most alarming tools used in intimate
partner surveillance (IPS) are spyware apps, which abusers
install on survivors’ phones in order to surreptitiously monitor
their communications, location, and other data. IPV survivors
[23, 29, 46], the professionals who assist them [29, 58], and
the media [9, 22, 37] report that spyware is a growing threat
to the security and safety of survivors. In the most extreme
cases, IPS can lead to physical confrontation, violence, and
even murder [10, 18].
The definition of “spyware” is a murky one. Some apps are
overtly branded for surreptitious monitoring, like FlexiSpy [2]
and mSpy [6]. But survivors and professionals report that
other seemingly benign apps, such as family tracking or “Find
My Friends” apps [8, 29, 58], are being actively exploited by
abusers to perform IPS. We call these dual-use apps: they
are designed for one or more legitimate use cases, but can be
repurposed by an abuser for IPS because their functionality
gives another person remote access to a device’s sensors or
data without the device user’s knowledge. Both overt
spyware and dual-use apps are dangerous in IPV contexts.
We provide the first detailed measurement study of mobile
apps usable for IPS. For (potential) victims of IPS, our results
are decidedly depressing. We therefore also discuss a variety
of directions for future work.
Finding IPS spyware. We hypothesize that most abusers find
spyware by searching the web or application stores (mainly
the Google Play Store or Apple’s App Store). We therefore
started by performing a semi-manual crawl of Google search
results. We searched for a small set of terms (e.g., “track my
girlfriend’s phone without them knowing”). In addition to the
results, we collected Google’s suggestions for similar searches
to seed further searches. The cumulative results (over 27,000
returned URLs) reveal a wide variety of resources aimed at
helping people engage in IPS: blogs reviewing different apps,
how-to guides, and news articles about spyware. We found 23
functional apps not available on any official app store, and a
large number of links to apps available on official app stores.
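The snowball-style expansion of search queries described above can be sketched as follows. This is an illustrative reconstruction, not the paper’s crawler: `get_suggestions` stands in for whatever source of related-search suggestions is used (e.g., a wrapper around a search engine’s autocomplete endpoint), and the cap and normalization are assumptions.

```python
from collections import deque

def snowball_terms(seed_terms, get_suggestions, max_terms=1000):
    """Breadth-first expansion of a seed list of search terms.

    `get_suggestions` is a caller-supplied function mapping a query
    string to a list of related queries (e.g., autocomplete results).
    """
    seen = set(seed_terms)
    queue = deque(seed_terms)
    while queue and len(seen) < max_terms:
        term = queue.popleft()
        for suggestion in get_suggestions(term):
            s = suggestion.strip().lower()
            if s and s not in seen:
                seen.add(s)          # record the new query
                queue.append(s)      # and expand it in turn
    return sorted(seen)
```

Because suggestions are fed back into the queue, a handful of seed terms can grow into a large set of related queries; the `max_terms` cap bounds the crawl.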
We therefore design, build, and evaluate a crawling pipeline
for Google Play [3], the official app marketplace for Android.
Our pipeline first gathers a large list of potential IPV-related
search terms by using search recommendations from the Play
Store, as we did with Google search. We then collect the top
fifty apps returned for each of the terms. Over a one-month
period, this approach retrieved more than 10,000 apps, though
many have no potential IPS use (e.g., game cheat codes were
returned for the search term “cheat”).
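The per-term collection step can be sketched as below, assuming a hypothetical `search_apps(term, n)` client that returns an ordered list of app package identifiers for a Play Store query (no such client is part of the paper; the bookkeeping shown is likewise illustrative).

```python
def crawl_play_store(terms, search_apps, top_n=50):
    """Collect the union of the top-N results for each search term.

    `search_apps(term, n)` is a placeholder for a Play Store search
    client returning an ordered list of app package identifiers.
    """
    apps = {}
    for term in terms:
        for rank, app_id in enumerate(search_apps(term, top_n), start=1):
            # Remember the best (lowest) rank seen and which terms
            # surfaced this app, for later relevance triage.
            entry = apps.setdefault(app_id, {"best_rank": rank, "terms": set()})
            entry["best_rank"] = min(entry["best_rank"], rank)
            entry["terms"].add(term)
    return apps
```

Recording which terms surfaced each app, and at what rank, helps later when deciding whether an app was returned only incidentally (as with game cheat codes for “cheat”).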
The data set is large enough that manual investigation
is prohibitive, so we build a pruning algorithm that uses
supervised machine learning trained on 1,000 hand-labeled
apps to accurately filter out irrelevant apps based on the
app’s description and the permissions requested by the app.
On a separate set of 200 manually labeled test apps, our
classifier achieves a false positive rate of 8% and a false
negative rate of 6%. While we do not think this represents sufficient
accuracy for a standalone detection tool given the safety
risks that false negatives represent in this context, it suffices
for our measurement study. We discuss how one might tune
the pipeline to incorporate manual review to achieve higher
accuracy (and no false negatives), as well as initial experiments
with crowdsourcing to scale manual review.
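The pruning step can be illustrated with a minimal, self-contained sketch: a bag-of-words representation of the app description plus one binary feature per requested permission, fed to a logistic-regression classifier trained by stochastic gradient descent. The paper’s actual feature engineering and model choice are not reproduced here; the function names, feature scheme, and decision threshold below are all illustrative assumptions.

```python
import math
import re
from collections import defaultdict

def features(description, permissions):
    """Bag-of-words over the description plus one indicator per permission."""
    feats = defaultdict(float)
    for word in re.findall(r"[a-z']+", description.lower()):
        feats["w:" + word] += 1.0
    for perm in permissions:
        feats["p:" + perm] = 1.0
    feats["bias"] = 1.0
    return feats

def train(examples, epochs=50, lr=0.1):
    """Logistic regression via SGD over (description, permissions, label) triples."""
    w = defaultdict(float)
    for _ in range(epochs):
        for desc, perms, label in examples:
            x = features(desc, perms)
            z = sum(w[k] * v for k, v in x.items())
            p = 1.0 / (1.0 + math.exp(-z))
            for k, v in x.items():
                w[k] += lr * (label - p) * v   # gradient step toward the label
    return w

def is_ips_relevant(w, description, permissions):
    """Score an unseen app against the trained weights."""
    x = features(description, permissions)
    z = sum(w.get(k, 0.0) * v for k, v in x.items())
    return 1.0 / (1.0 + math.exp(-z)) >= 0.5
```

In practice the model would be trained on the 1,000 hand-labeled apps and its false positive/negative rates estimated on the held-out 200-app test set; the threshold could then be tuned to trade false positives for manual-review load.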
We performed a smaller study using our measurement
pipeline with Apple’s App Store, and got qualitatively similar
results. See Appendix B.
The IPS landscape. The resulting corpus of apps is large,
with hundreds of Play Store applications capable of facilitating
IPS. We manually investigate in detail a representative subset
of 61 on-store and 9 off-store apps by installing them on
research phones, analyzing the features and user interface they
2018 IEEE Symposium on Security and Privacy
© 2018, Rahul Chatterjee. Under license to IEEE.
DOI 10.1109/SP.2018.00061