What we learned collecting algoglitches on Twitter
Algoglitch is no bug
Two months ago, as part of this project, we set up a Twitter account called Sensibilités algorithmiques. This account is one tool in our research toolbox for exploring our sensitivity to being calculated by algorithms, in particular in our use of digital platforms. The account itself aimed to capture the discussion about algorithmic miscalculations on Twitter. In doing so, we wanted to keep collecting data on "algoglitches" by asking followers to contribute. But we were also seeking to define the term "algoglitch" itself in order to constitute it as a political object, so we were curious about the artefacts we would receive and tweet.
They turned out to be pretty different from what we expected.
We also launched the accompanying hashtag #algoglitch. We did not start a trend, but we learned about the rather interesting history of the term "algoglitch" in the early 2010s. For instance, "algo glitch" made the headlines of the Financial Times on August 1st, 2012, when the New York stock exchange cancelled trades due to a dysfunction in Knight Capital's trading algorithm.
"In the case of Knight Capital, which suffered a loss of $440m, Tom Joyce, chief executive, told Bloomberg on Thursday the implementation of new software contained a bug that sent erroneous orders into the market."
The event was fatal to Knight Capital. It led to its acquisition in December 2012 by Getco LLC.
This kind of bug is dreaded by everyone in a software-based society. Though no different from any other bug in a piece of code, its far-reaching consequences stem from the delegation to algorithms of a core task in the infrastructure of financial capitalism. Hence the name glitch, a transient coding error (a euphemism, perhaps?) that can apply to electronic circuits, images, video games, etc., as well as to pricing algorithms.
The glitch has given rise to an art genre, spurring creativity in many media such as video and music, and even architecture.
The algoglitch is thus a disruption that affects our supposedly seamless interaction with algorithms, i.e. an experience where the algorithm disappears behind the activity we are engaged in: browsing the web, watching videos on YouTube, shopping, driving a car, booking an Uber ride, etc. The transparency of the algorithmic layer equals trust in the transaction we engage in, which in turn equals the obscurity of the calculations we are subjected to. The algoglitch breaks this chain. It enables the user to notice, to wonder "Am I being calculated?" and to share their perception in the form of screenshots and posts on Twitter.
Screenshots are complex images that constitute entry points into invisible data infrastructures. Some are very sophisticated and multilayered, like the example below, which combines a positive appreciation of the job done by Spotify's algorithm with a contradictory critical image (incidentally, a successful encounter with algorithmic agency can occasionally be reported too).
However, the contributions of the followers of our @algoglitch account slightly altered the conceptual territory of the glitch. Whereas we expected wrongly targeted ads and concerns about data privacy or the tracking of user behaviour, we discovered another kind of algoglitch.
During the recent fire in California, the Los Angeles Police Department issued a warning via Twitter asking drivers not to follow navigation apps.
A woman tweeted about the weird behaviour of her Roomba robot vacuum cleaner.
A PhD candidate in sociology was furious that YouTube chose, as the thumbnail for her uploaded sociology dissertation, the image of a racist meme she was actually criticizing.
In these three instances, the algorithm performs absolutely flawlessly: the navigation apps direct drivers through the routes with the least traffic (which is what they are supposed to do); the vacuum robot's cliff sensors correctly interpret the infrared signals it sends out as meaning it is approaching a dropoff or stairs; YouTube's thumbnail generator rightly recognises the only frame in the video that contains a human face (which optimizes the number of views).
Here is the carpet that confuses the Roomba:
One could infer that proper interior design should not make use of colorful carpets with black patterns, that there should be no fires near roads, and that slides with text should not be uploaded as creative content on YouTube.
The problem is not the algorithm's failure to compute data, but rather its computing the "wrong" data: data irrelevant to describing the environment in which the action takes place.
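The Roomba case can be made concrete with a minimal sketch. It assumes a simplified threshold model of an infrared cliff sensor; the values, threshold, and function names are illustrative assumptions, not iRobot's actual firmware logic.

```python
# Toy model of a Roomba-style cliff sensor: the robot emits infrared light
# downward and measures how much bounces back. A weak reflection normally
# means a dropoff (stairs). All values here are illustrative assumptions.

CLIFF_THRESHOLD = 0.2  # assumed minimum reflectance for "safe floor"

def cliff_detected(ir_reflectance: float) -> bool:
    """The sensor logic itself is correct: low reflectance means cliff."""
    return ir_reflectance < CLIFF_THRESHOLD

# A real dropoff reflects almost nothing: correctly flagged.
print(cliff_detected(0.05))  # True

# A black pattern on a carpet absorbs infrared, so it also reflects almost
# nothing: the algorithm flawlessly computes the "wrong" data.
print(cliff_detected(0.08))  # True: the carpet reads as a cliff

# A plain light-coloured floor reflects plenty: safe.
print(cliff_detected(0.9))   # False
```

The comparison itself never misfires; it is the environment (a dark pattern on a flat floor) that feeds it data irrelevant to the question "is there a dropoff here?".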
In 2011, a year before the "algo glitch" crash on the New York stock exchange mentioned above, "algo glitch" was used by artist Daniel Temkin in the Motherboard interview There's not much glitch in glitch art. The artist does not advocate for the glitch aesthetic. On the contrary, as the title of the article says, he disagrees with the popular use of the term "glitch" in art and offers an alternative glitching practice:
"What makes algo-glitch demented is how we misuse existing algorithms, running them in contexts that had never been intended by their designers."
In another article, entitled Notes on Glitch, Temkin gives further insight into the fundamental difference between the mainstream definition of the glitch and his own: what makes computers so foreign to us humans is not that they make mistakes, but that they go on calculating whatever data they are given. Even more: they usually compute those data well.
"The existence of glitch-based representation depends upon the inability of software to treat a wrong bit of data in anything other than the right way. The word “glitch” in this sense does not solely represent the cause that initiates some failure, but also the output that results when improper data is decoded properly. An isolated problem is encountered and, rather than shutting down, the software prattles on. Stated differently, it is a given program’s failure to fully fail upon encountering bad data that allows a glitch to appear. The instigation of such defect-driven churning is the crux of the practice known as Glitch Art."
Glitches are not algorithmic failures. They are the impossibility of failing.
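Temkin's idea that "improper data is decoded properly" can be sketched with a decoder that has no notion of invalid input. The tiny grayscale decoder below is an illustrative stand-in, not any real image codec: like a raw, headerless pixel format, it will render any byte stream whatsoever.

```python
# A minimal sketch of Temkin's point: a decoder that cannot fail.
# Raw (headerless) pixel formats treat any byte stream as image data,
# so wrong input is still decoded the "right" way.

def decode_as_grayscale(data: bytes, width: int) -> list[list[int]]:
    """Interpret arbitrary bytes as rows of grayscale pixels (0-255).
    There is no 'invalid' input: the software prattles on."""
    usable = len(data) - len(data) % width  # drop the trailing partial row
    return [list(data[i:i + width]) for i in range(0, usable, width)]

# "Wrong" data: the bytes of an English sentence, not an image at all.
text = b"Glitches are not algorithmic fails; they are the impossibility to fail."
image = decode_as_grayscale(text, width=8)

print(len(image))  # 8 complete rows of 8 pixels each
print(image[0])    # the letters "Glitches" rendered as pixel values
```

The decoder's failure to fully fail is exactly what produces the glitch image: a sentence becomes a plausible, if meaningless, block of pixels.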
Also helpful to us is the article Glitch as Infrastructural Monster, in which the authors, Meredith and Nathan Johnson of the University of Florida, recount a pretty disagreeable air travel experience between Tampa and Indianapolis in January 2014. The first part of the article describes the outward journey in the midst of a powerful winter storm that put the whole travel industry to a grueling test (the airline, the airport, information services, hotel reservations, alternative means of transportation, weather conditions, travelers' patience, etc.). The authors describe the emergence of an emergency infrastructure with a rather successful ability to adapt to the difficult circumstances. In the midst of the chaos, however, Meredith's luggage got lost, only to be found at the end of their stay, so she had to buy new clothes in the meantime.

On the way back, weather conditions had returned to normal, and so had the travel industry's operations. At the airport, Meredith's luggage, which now contained her original clothes plus the ones she had bought, weighed more than the authorized limit. To her dismay, she was asked to pay an additional fee as per airport rules. No explanation of why the limit had been exceeded seemed to sway the employee's or the manager's decision since, well, the scale had the last word on the matter. At one point, however, another scale was used that gave the luggage a different weight. The uncertainty about the luggage's weight enabled the passengers to continue the conversation with a manager who was now much more willing to negotiate. He finally let Meredith go without paying the additional fee.
Now, the authors' point is that the glitch in the story is not the scale error (one of the scales, or perhaps both, is malfunctioning). The real glitch is the collision between two types of infrastructure: the "status quo infrastructure", where routine procedures are applied, such as paying fees when your luggage is overweight; and the "emergency infrastructure", handled very well by the airline in extraordinary situations such as Meredith and Nathan's outward journey, where priorities are different and rules can be bent. This entails a different distribution of agency between the actants, human and non-human, involved: "The scale glitches and, like the infrastructure in which it is embedded, becomes visible “as something that manipulates and as something that can be manipulated” (Boyle 12). Monsters are misfits. They stand out. Accounts of infrastructural monsters are revelatory. By positing the malfunctioning scale as an infrastructural monster, this article takes glitching seriously “as the source of agency and thought rather than its limit”."
The consequence for our algorithmic glitches is that the right question to ask might not be how to report and correct computing errors, nor how to debunk human biases reified into algorithms, but rather how to produce "glitch-based representations" that focus our attention on the conflict between different underlying infrastructures and distributions of agency. This starts with a method for looking at algoglitches, selecting them and categorizing them.
Last Christmas, a frustrated father complained about the answer the smart speaker Google Home Mini provided to his daughter's question: "Hey Google, what's Mrs. Claus doing?"
The tweet was hashtagged #AIFail, yet we retweeted it, thereby recategorizing it as an algoglitch: by flawlessly performing the task of looking for verified information about Santa Claus, the algorithm also exerted a source of agency entirely different from the father's agency to create a world for his daughter in which Santa Claus exists and brings presents and joy to children.