Amelie Schläffer is looking for your footprints (and other data)
Written by Allie Monck
Surveillance remains a contested zone. It has opened a field of study, been routinely refreshed, and become increasingly mundane. Scholars like Simone Browne push back against the idea that “modern” surveillance emerged only post-9/11, arguing instead that the practice of being watched has persisted throughout history and disproportionately affects marginalized communities. While many of us may shrug off surveillance because we have “nothing to hide,” others are not given that choice. Tools of surveillance permeate both our physical and digital landscapes, congealing into an aesthetic that is grainy, watchful from above, and nearly stalker-esque.
Amelie Schläffer’s work commandeers this aesthetic to consider questions of data ownership. She became interested in data and surveillance after reading about the value of personal data. Selling online data has become a multi-billion dollar industry, yet the average internet user is often unable to determine how much their own data is worth or who it is being sold to. Schläffer’s project Data Footprints, featured in the STP BFA Show 2020, grapples with these ideas. Discussing her formulation of the project, she notes, "I drew a comparison of when you walk through a public space, you leave behind traces such as a footprint in a park. Does that belong to you? Or is it free for anyone to use and do whatever they want with it?" She went to public parks in London, found footprints in the mud, and created plaster casts of them. She highlights, "I’m taking something that isn’t really mine, but it’s also not really anyone’s." This contested sense of ownership reflects the dialogues around personal data happening in the tech sphere.
Schläffer then used her plaster casts in her video “Data Footprint,” where she zoomed "into the texture and deeply examined every crevice. It almost transforms into a data landscape. The video is in black and white, giving that surveillance effect. When I showed it for my crit I had it on an old monitor." The aesthetics of surveillance have become almost definitive of a new form of visualization, whether it be the hazy screen of a CCTV monitor or the constant aggregation of personal data online, the ones and zeros behind every visual we publish. Schläffer mirrors the ways our everyday actions can be found, collected, and spread for another person’s purposes. She reflects, "I thought it would be interesting to know whoever made that footprint, because I took it and I was exhibiting it and it’s now able to be seen by people worldwide." Her ideas around data ownership become clear here, as the park-goer’s footstep has taken on broader implications regardless of their intentions. Can our actions remain innocent if anyone is able to scoop that information up?
Schläffer recently completed her foundation at Central Saint Martins (CSM) in London, where she found a community of people who "were interested in having conversations and not afraid to talk about abstract concepts." A class discussion prompted Schläffer to begin the “Data Footprints” project, emphasizing the importance of the quotidian in her work. She recalls feeling "so much power in hearing people talk about things you care about." Her time at CSM has provided her with a thought-provoking network of peers, an influence she is grateful for.
For now, Schläffer is in London, continuing to go down research rabbit holes that prompt upcoming work. She has been spending time on Google Earth, saying she has "just [been] going to Italy and walking around digitally to have that feeling of being somewhere else." This activity feels like a fitting pairing with her footprints, as Schläffer leaves digital footprints for Google to discover. Another of her projects, Ultra Candid, includes still images from Google Earth to address the implications of digital representations of reality. Her work often reifies the same systems she is questioning, reinforcing how difficult it is to work these questions out in a neutral realm. As surveillance continues to evolve and grow more sophisticated, these interventions may become even more difficult to make.