Privacy Tools as a Vault for Concealing Illegal Activity

Child safety on the internet, specifically the issue of child pornography, is often represented as an ongoing war between law enforcement officials who seek to protect children and third-party offenders who seek to exploit their relative vulnerability in online networks. However, this insular and dualistic view of the issue, reinforced by media coverage that oversimplifies the challenges of keeping children safe online, overlooks the potential for minors to criminalize themselves.

In 2015, The New York Times reported that students in Cañon City, Colorado could face criminal charges after an investigation discovered they had been exchanging hundreds of nude pictures of themselves and other teenagers on their phones, using free apps available on iOS and Android to keep the images secret. With 'Photo Vault' apps, applications that conceal images on a device during regular patterns of use, these minors (grades 9 through 12) were able to hide a private locker of images behind an interface masquerading as a calculator app.

This case pits two issues against one another: the right to privacy versus child safety online. On the one hand, nothing about these apps invites criminal activity: they are designed to provide an insulated space for privately storing images, and since the apps themselves did nothing wrong, their developers cannot be held responsible for how users exploit the software; there are no grounds on which to remove the apps from the marketplace. On the other hand, if children can weaponize these vault apps to store illegal content en masse, such as pornography of themselves and of others, one question arises: what conditions for access can be set in place to protect minors from themselves online?

Framing the issue in this way risks depicting children as borderline sleeper bodies, with criminality hiding within the seemingly innocent body of the child. Perhaps the answer lies not in simply removing the tools, or in removing the right to privacy, but in building into these tools forms of education about what kinds of material stored within these vaults constitute illegal content. One technical solution might be to build into the IPTC Core a layer of metadata capable of uniquely identifying images already established as child pornography and tracing their path through Photo Vault apps on various phones, enabling a user to remotely force the deletion of such images from every device that shares them. Effectively, this solution would require all users to agree to new Terms of Use permitting remote deletion initiated by the users who originally sent out the illicit images. While this might help individuals protect their own privacy, and enable law enforcement to track down images established as illegal content, it holds no guarantee that another Photo Vault app won't immediately appear in the marketplace to counter the change.

Source: The New York Times, "Colorado Students Caught Trading Nude Photos by the Hundreds"

