Decolonial and Performing Algorithms

We held the last of our “Critical Conversations” series on Monday, April 18th and we focused the discussion on algorithmic accountability and algorithmic bias. We also spent a final hour discussing what the next steps might be for the collective that has emerged from these conversations.

We began with a discussion of what exactly algorithms are and do – how algorithmic thinking works in both computational and everyday decision-making practices (at the end of the post are several helpful resources). Drawing on this conversation, we began thinking a bit about just how much tacit knowledge is actually expressed in the process of making algorithms and algorithmic decisions. Touching lightly again on our theme of the displacement of knowledge from the body, we noted that algorithms obfuscate personal knowledge. We spent time mulling over the challenges of transparency with respect to that knowledge, as well as those of trying to document a process that is often iterative, multiply authored, and very often reliant on hunches and guesswork. We dreamed of a visualization of the volume of decision making that goes into the harvesting or marshaling of a single data point.

We also talked at length about relationality and the ways that algorithmic approaches often obscure the relationships between networks of people and things. We wondered about ways to think of the mess or the cacophony of relational information as an asset rather than a barrier. Even as we championed the value of good metadata for discovery and use in systems, archives, and the like, we wondered if we could get away from the disciplinary processes of colonial knowledge systems. As one participant observed, all cultures seem to want to “tame knowledge,” so categorization and abstraction show up in a multitude of epistemes, not only those of western enlightenment thought. This led us then to ask about the possibilities of “co-prioritization” of knowledge systems – thinking of a “many paths” approach to knowledge and decision making that would create space for a plenitude of algorithms, each derived from and suited to different knowledge systems. We wondered, would this offer us the possibility of decolonial algorithmic cultures?

We noted that public awareness of how algorithms work is relatively low. We also talked a bit about the techno-utopian approaches to algorithmic optimization and the myth of the apolitical/neutral algorithm. Even as we lamented the sense in many spaces that algorithmic processes are neutral, one of our participants observed that the military has a clear mandate to keep human action linked to algorithmic work, even if only to “have someone to fire” when things go wrong. So while it may seem that the law has a long way to go in terms of understanding who might be held accountable for algorithmic bias, we do have a model that demonstrates a clear-eyed understanding of accountability even at a distance.

This then led us to daydream about a research project that we might call Performing Algorithms, which would allow us to demonstrate to everyday users just how powerful algorithmic personalization can be for our individual and collective experiences in digital cultures. We imagined a set of user profiles that could then be activated in an installation where a single search string would produce a range of results from the same search engine. We imagined it as a kind of gesture to the multiplicity of the web and the powers of “customization” and “optimization” for shaping an experience that many people assume is universal rather than highly crafted and, in some sense, personal. As one person put it, this project would allow us to demonstrate the ways that a person’s life is “vectored” by largely hidden algorithmic processes.
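To make the intuition behind such an installation concrete: the core mechanism could be as simple as re-ranking one shared result list differently for each user profile. The sketch below is purely illustrative – the results, topics, and profiles are all hypothetical, and real search engines use far more elaborate signals – but it shows how the same query can yield divergent “personalized” orderings.

```python
# Minimal sketch of profile-based re-ranking. All data here is invented
# for illustration; no real search-engine API is involved.

RESULTS = [
    {"title": "Vaccine safety overview", "topics": {"health", "science"}},
    {"title": "Vaccine skepticism forum", "topics": {"health", "opinion"}},
    {"title": "History of inoculation", "topics": {"history", "science"}},
]

# Each hypothetical profile maps topics to affinity weights.
PROFILES = {
    "science_reader": {"science": 2.0, "health": 1.0},
    "forum_regular": {"opinion": 2.5, "health": 1.0},
}

def personalized_ranking(results, profile):
    """Order results by the profile's summed affinity for their topics."""
    def score(result):
        return sum(profile.get(topic, 0.0) for topic in result["topics"])
    return sorted(results, key=score, reverse=True)

# The same "query" produces a different ordering for each profile.
for name, prefs in PROFILES.items():
    ranked = [r["title"] for r in personalized_ranking(RESULTS, prefs)]
    print(name, "→", ranked)
```

In an installation, each visitor would effectively step into one of these profiles, seeing how the ordering – and therefore the apparent “answer” – shifts with the weights assigned to them.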

We concluded with an exciting discussion of future work…but that will have to wait for another day. 🙂

A handful of resources on algorithmic culture and studies

Ted Striphas “What is an Algorithm?”

Algorithm Reading Club

Also @algorithm_club and #algclub

“Information Intermediaries Playlist” (includes Soraya Chemaly and Catherine Buni on the “Unsafety Net”)

Soraya Chemaly and Catherine Buni “The Secret Rules of the Internet”

A collaborative statement on “Intersectional Data” (contact J. Wernimont for more info)
