Research Studio Essay - MA Design & Computation


Unstable Archives - Unstable Futures

Digital archives can be considered unstable environments haunted by specters of futures past and futures present, producing (deadly) futures through technologies of patterning, prediction and proxies. Considering the future as unstable, however, also allows for a way in: unstable futures are, after all, unwritten futures, and the "noise" in the archive is generative, forming new relations and new orders.


Computing technologies can be understood as literally unstable (MELT 2021) through their medial and technological condition of fragility. Data centers require constant maintenance and replacement, they demand electricity, and they are susceptible to the climate. Important nodes of the Internet are placed in colder climates such as Sweden or Oregon, and the heat generated by computer systems poses one of their greatest threats (Starosielski 2014, p.2505). The digital archive is, after all, material, and the material revolts against its organization (Cubitt 2017, p.2).

Future technologies become past technologies increasingly quickly. Take the curious case of Lansdale Semiconductor Inc., a company specializing in keeping older technology products in production. Its best customer: the US Department of Defense. The company writes on its website: "As technology advances, Lansdale is also keeping an eye on the future." A technological future is, in turn, always a technological past in the making; "obsolescence is innovation in reverse" (Altman).

The data archive is defined by friction at points where "the very collection of data rubs against accuracy" and where uncertainty is introduced into the algorithmic system (Richardson 2019, p.4). The instability of the archive becomes present through noise - through wrongly classified information, through glitches, errors and erosions, and through dirty data as a "cache of surreptitious subaltern refusal" (Steyerl 2018, p.6).
The subaltern here means those inaudible, ghostly voices that disappear as an effect of Othering: those who are erased through the denial of difference (Spivak 1988, p.25). Glitches are performative: they change the performance of the program, and they remind us of the ongoing exploitation through technologies that have become second nature through their deep embeddedness. They are the "evidence that control is never complete" (Cubitt 2017, p.2).

SPECTERS OF PAST AND YET-TO-BE

If we consider an algorithmically determined future through the lens of a "past that hasn't happened yet" (Sterling 2001), the archive (as in, the data on which algorithms are trained and with which they evolve) becomes central to the formation, to the design, of the future. The widespread practice of collecting data seemingly for its own sake is connected to the belief that such (past) data will be useful in the future, for example for predictive means. In her essay on Xeno-Patterning, Luciana Parisi considers how the future and history of humanity are steadily captured by data, with capitalist companies aiming to own the "master pattern" (in reference to the "master key"), which would give them access to "the universal history of humanity: what has happened and could ever happen to the species, the planet, the solar system, and the galaxy" (Parisi 2019, p.38).
In the lecture-performance "Metric Mysticism", artist Zach Blas suggests the crystal ball, apparent in the logo of Palantir Technologies, as a paradigm used by Silicon Valley entrepreneurs to imagine algorithmic information processing (Blas 2017-18). The palantír in The Lord of the Rings, after which the company co-founded by Peter Thiel is named, is an all-seeing crystal ball. Data, under this paradigm, determines the future and is absolute in its power. The crystal ball represents absolute datafication and total knowledge of the present as well as prophecy of the future. Similarly, predictive technologies are built on the data archive and aim to help prevent a future risk before it manifests itself.

PREDICTIVE WARFARE & POLICING

Predictive policing technologies are more often than not an example of machine bias rather than an example of successful crime prevention. In the case of the COMPAS recidivism prediction algorithm, Black defendants were issued higher risk assessment scores than similar white defendants (Hanna 2020, p.503). Similarly, the predictive policing software PredPol was shown to specifically target poor, Black and Latino neighborhoods (Heaven 2020).
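
How such bias reproduces itself can be illustrated in a few lines of code. The following Python fragment is a deliberately minimal, entirely synthetic sketch - not COMPAS or PredPol, whose internals remain proprietary - of a risk-scoring model trained on unevenly recorded arrest data; the groups, base rates and policing skew are invented assumptions:

    # Synthetic sketch of bias reproduction in a risk-scoring model.
    # Nothing here is COMPAS or PredPol; group names, base rates and the
    # policing skew are invented for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B
    true_risk = rng.random(n) < 0.2          # identical underlying base rate

    # The archive records "offences" unevenly: group B is policed more heavily,
    # so its behaviour is far more likely to end up as a recorded label.
    recorded = true_risk & (rng.random(n) < np.where(group == 1, 0.9, 0.5))

    # Train on the recorded labels, with group membership as a (proxy) feature.
    X = np.column_stack([group, rng.normal(size=n)])
    model = LogisticRegression().fit(X, recorded)

    scores = model.predict_proba(X)[:, 1]
    print("mean risk score, group A:", scores[group == 0].mean())
    print("mean risk score, group B:", scores[group == 1].mean())
    # Group B is scored as "riskier" although the underlying risk is equal:
    # the bias lives in the archive, not in the behaviour.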

Predicting the future also plays a central role in the "War on Terror", especially in the context of drone killings and the calculation of risk. As Eyal Weizman notes, under US regulations on the operation of drones and drone killings, a person can only be targeted on the basis of actions they are expected to take in the future, in the form of an imminent threat, rather than as punishment for an actual crime (Weizman 2016, p.39). Known as signature strikes, these drone killings depend on pattern-of-life analysis, which "seeks to map sustained patterns in daily rhythms and activities in order to identify traces of a potential enemy" (Richardson 2019, p.3). What is such an "imminent threat" abstracted from? The commission report on the 9/11 attacks speculated that the attack could have been prevented if only the data on travel patterns had been connected with the information already available in US government databases (Amoore, de Goede 2021, p.425). In the years that followed, threats were abstracted from pre-existing data - phone calls, transactions, movements, meetings - all collected and processed in the data archive; from "traces that might compose a chain of action" (Weizman 2016, p.39).
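
The logic of this abstraction can be sketched, with the caveat that actual signature-strike systems are classified and vastly more complex; in the following toy fragment, the event types, the stored "signature" and the threshold are all invented, and the point is only that the match registers co-occurrence in the archive, never intention:

    # Toy sketch of pattern-of-life matching. Event types, "signature" and
    # threshold are invented; real systems are classified and far more complex.
    import numpy as np

    KINDS = ["call", "travel", "meeting", "transaction"]

    def life_pattern(events):
        """Reduce (hour, kind) events to a fixed-length behavioural vector."""
        vec = np.zeros((24, len(KINDS)))
        for hour, kind in events:
            vec[hour, KINDS.index(kind)] += 1
        return vec.flatten()

    def similarity(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    signature = life_pattern([(5, "travel"), (6, "meeting"), (6, "meeting"),
                              (20, "call"), (21, "call")])
    prayer_group = life_pattern([(5, "travel"), (6, "meeting"), (6, "meeting"),
                                 (20, "call")])
    courier = life_pattern([(9, "travel"), (12, "transaction"), (18, "call")])

    THRESHOLD = 0.8   # arbitrary: above this, a pattern becomes a "potential enemy"
    for name, pattern in [("prayer group member", prayer_group),
                          ("courier", courier)]:
        score = similarity(signature, pattern)
        print(name, round(score, 2), "flagged" if score > THRESHOLD else "ignored")
    # The prayer group member matches the signature almost perfectly: the data
    # carries co-occurrence, not intention.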

In the context of counter-forensics, this data archive would in turn provide a means of reconstructing such a chain of action and of re-evaluating the supposed imminent threat. However, because predictive algorithms produce the present as a means of preventing the future present, they in turn produce the future itself. How can predictive algorithms be proven wrong when they are used preemptively, except in extremely clear cases of false positives? And even if such proof were possible, the archive remains under lock and key, as the difference in resolution between public and military satellite data shows. Only through other data, low-resolution video or otherwise, can drone strikes be reconstructed. In the case of the counter-forensic investigation of the 2012 drone strike in Miranshah, simulations of the building were used as proxies to help a witness access the traumatic memories of the attack (Forensic Architecture 2014).

PROXIES AND CLIMATE MODELS

Proxies, as stand-ins and procurators, are intermediaries that stand for something else - VPN networks, surrogates, puppet states, bots - and they displace power (Levin, Tollmann 2017, p.9). In statistics and climate research, proxies are used as approximations; they do not refer to an existing dataset or phenomenon but rather represent an "incomplete foray into an unknown terrain" (Levin, Tollmann 2017, p.10). Climate models use data on spatial and temporal patterns of climate change from past centuries in order to assess the anthropogenic impact on the climate in the future. Proxies are spectral in the sense that they exist and structure our world, yet are not "the thing", remaining in the "sphere of near-knowledge" and introducing "the specter of the unknowable" instead (Chun 2018). They can refer to phenomena of the past and present as well as of the future, as proxies of the (yet) unknown or ungraspable. In the context of archive and future(s), proxies work to "extend the archive, the knowable, by capturing or syncing to what is not there" (Chun 2018). Seen as a pharmakon, the proxy also becomes a powerful tool for mobilization, as its ambivalence and intrinsic inadequacy allow for its use as a subversive means (Chun 2018).
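
How such a proxy operates in practice can be sketched with invented numbers standing in for real tree-ring or ice-core series: an indirect trace is calibrated against the short instrumental record and then made to speak for measurements that were never taken:

    # Minimal sketch of proxy-based reconstruction; all values are synthetic
    # stand-ins for real tree-ring or ice-core data.
    import numpy as np

    rng = np.random.default_rng(1)

    # Instrumental period: both the proxy and the measured temperature exist.
    temp_measured = np.linspace(13.0, 14.2, 60) + rng.normal(0, 0.1, 60)
    ring_width = 0.8 * temp_measured + rng.normal(0, 0.05, 60)

    # Calibration: fit a linear relation from proxy to temperature.
    slope, intercept = np.polyfit(ring_width, temp_measured, deg=1)

    # Pre-instrumental period: only the proxy survives in the archive.
    old_ring_width = rng.normal(10.3, 0.2, 300)
    temp_reconstructed = slope * old_ring_width + intercept

    print("reconstructed pre-instrumental mean:",
          round(float(temp_reconstructed.mean()), 2), "degC")
    # The reconstruction stands in for measurements that were never made,
    # "capturing or syncing to what is not there" - with all the uncertainty
    # that entails.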

SPECTRAL DATA

In Derrida's hauntology, the specter represents what is not fully present (Derrida 1994, p.161). It marks a relation to what is not yet - what has not happened yet but is nonetheless "effective in the virtual" - or to what is no longer - something that does not really exist anymore but remains effective, e.g. a compulsion to repeat or a fatal pattern (Fisher 2012, p.19). Algorithms govern the sphere of the spectral: "race, gender, caste, culpability, 'killability'" (Spheres Editorial Collective 2019, p.4) are all socially constructed apparitions and in this sense immaterial, yet of bodily and material consequence at the same time (Hanna 2020, p.503). The data archive forms an "invisible crowd", invisibly contributing to the manipulation of thought and behavior. Data stand in the tradition of the "invisible crowds" described by Canetti: the dead that influence a society through beliefs and stories, the spirits and demons that still govern our present (von Wedemeyer, Pasquinelli, Caffoni 2022).

UNSTABLE FUTURES ARE UNWRITTEN FUTURES

If the data archive, haunted by specters of the past and the yet-to-come, is considered through the lens of power and politics, it becomes clear that, much like the traditional archive, it is governed by the authorities as well as by private companies. According to Michael Richardson, it is not only important to consider the way algorithms work or what they are doing, but also their broader (political) function "of evacuating the space of politics of the necessity of decision: algorithms appeal to the state precisely because they enable the deferral of responsibility" (Richardson 2019, p.5). In short, predictive algorithms serve as proxies, as specters of the yet-to-be, while forming what is yet-to-be if given the power to do so - as they have in the case of predictive policing, predictive drone killings or credit ratings. A system of prediction requires stability - the stability of classifications, so that objects and bodies stay within their class (Richardson 2019, p.5).
However, this demand for stability is rather a symptom of instability, as the lack of affordance for fluidity makes the limits of such a system clear. In the context of drone killings, it becomes essential to know whether a cluster of bodies is a prayer group or a terrorist meeting, whether a SIM card has been given to a family member or a vehicle borrowed (Richardson 2019, p.6). These classifications, on which prediction algorithms are built, are unstable - both in dangerous, even lethal ways and as an opening for resistance.
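
A small, invented sketch of this fragility: if the unit of classification is the SIM card rather than the person, a borrowed phone silently carries the risk label with it:

    # Invented sketch: risk labels attached to SIM cards rather than persons.
    risk_labels = {"SIM-4471": "high", "SIM-9902": "low"}

    def score_person(person, carried_sim):
        # The system never sees the person, only the device they currently carry.
        return risk_labels.get(carried_sim, "unknown")

    print(score_person("target", "SIM-4471"))              # high
    print(score_person("relative of target", "SIM-4471"))  # high - same SIM, different body
    print(score_person("target", "SIM-9902"))              # low  - same body, different SIM
    # The classification holds only as long as bodies and objects stay fixed to
    # their categories; a borrowed phone is enough to break it.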

The instability of the data archive is responsible for the (in)stability of the future, yet it is also a way of escaping such a future. Under the rules of this new predictive order, examining the past - what is collected in the data archive, what is categorized, and whom these categories serve - becomes a means of examining the future. Further, this instability is an opening for exploit, for hacking the archive, for increasing its dysfunction to a maximal level. To use proxies in speculative ways in order to destabilize others. To go beyond the hyperstitional trauma of algorithms designed to intervene proactively, bringing about their own material reality, and to find counter-stories that tell differently of the yet-to-be.