Algorithmic Systems, Strategic Interaction and Bureaucracy
1. Algorithmic Systems, Strategic Interaction and Bureaucracy
Airi Lampinen
@airi_
tinyletter.com/airi_
2. “The Sharing Economy” and Exchange Platforms
3. Post-Interaction HCI
• Systems that are used in an interactive
‘loop’ where there is no direct dialogue
• There is still input and output but they are
not tightly coupled
• What are the most crucial socio-technical
questions regarding city bikes?
4. Words That Seem to Be Everywhere This Year
• Algorithms
• Artificial intelligence
• Machine learning
• Data
8. “The logic of how these algorithms have been applied
follows from the longstanding ideals of bureaucracies generally:
that is, they are presumed to concentrate power
in well-ordered and consistent structures.
In theory, anyway. In practice, bureaucracies tend toward
inscrutable unaccountability, much as algorithms do.
By framing algorithms as an extension of familiar
bureaucratic principles, we can draw from the history
of the critique of bureaucracy to help further
unpack algorithms’ dangers.”
Editor’s Notes
This is a talk I gave at the Making Sense of Algorithmic Systems symposium at the Annual Social Psychology Conference in Helsinki, Finland, on November 18, 2017. It represents early thinking on new ideas, developed in collaboration with Matti Nelimarkka, Jesse Haapoja, Juho Pääkkönen & others. In what follows, all mistakes are mine.
Most of my recent research deals with the so-called sharing economy. That’s not what this talk is about, but if you’d like to hear more, I’d be happy to hear from you. For my academic papers, take a look at https://scholar.google.com/citations?user=e9VSTe0AAAAJ&hl=en
What might post-interaction HCI (Human–Computer Interaction) look like? This is a conceptual shift we are grappling with and trying to make sense of. Inspired by Alex Taylor’s thoughts, I like to think of city bike systems as one example that pushes us to think about things in new ways. If this isn’t business as usual, then what? But also what can we learn from the past?
And the more we launch into talking about algorithmic systems, the more we need to be precise about how exactly they differ from socio-technical systems more broadly.
Here are some words I’ve heard awfully often this year, and there are problems with how they are used in public (and also academic) conversations: lots of fear-mongering, and moments where systems are narrated to hold more power and capabilities than they actually have. One thing that seems clear is that all things digital, and the datafication of everything, are attracting a lot of attention.
The good news is that critical researchers are already on these phenomena, too. There has been a proliferation of critical studies of algorithms and data. This resource, collected by Nick Seaver and Tarleton Gillespie, is a fantastic place to start if you’d like a glimpse of what is going on: https://socialmediacollective.org/reading-lists/critical-algorithm-studies/
We need to keep asking questions about what algorithms are and in what ways they are interesting. One important observation underlying the shift to talking about algorithmic systems, rather than algorithms on their own, is that algorithms don’t exist in isolation. On this account, I recommend this short piece by Paul Dourish!
Another source of inspiration for me has been this popular piece on the similarities between bureaucracy and algorithmic systems: http://reallifemag.com/rule-by-nobody/ The analogy does not work 1:1, of course, but there is something to it. And this points to where I think social psychology has an opening to step in and speak up: our field has a lot of expertise on social interactions, including strategic ones, and on organizations. These are needed in conversations about algorithmic systems.
Quote from “Rule by Nobody” by Adam Clair
A further source of inspiration is a lesser-known book by Erving Goffman, a microsociological take on game theory! As I see it, there are (at least) two levels worth thinking about here:
1) Computer-mediated communication: how does social interaction play out in the context of algorithmic systems, how do individuals and groups use these systems in strategic ways in interacting with others?
2) Human–computer interaction: how do individuals and groups “game the algorithm”? Here, one might think about Uber drivers strategizing with one another (and against the company and its app) to make more money, but also about the kinds of workarounds that have long been observed as part of the “normal” repertoire of how people make socio-technical systems work.
Goffman’s work gives us tools to consider how individuals can interact with algorithmic systems (and with one another in the presence of these systems) in active, purposeful ways, rather than as the dopes fooled by black boxes that popular accounts sometimes make us out to be! But we need to be careful in considering what we can take from this work, which is focused on rich, face-to-face interactional settings.
And then, bureaucracy itself has of course been studied a lot. I presume that if you think of someone when you hear the word bureaucracy, that someone might be Max Weber. I, however, am intrigued to revisit Michel Crozier’s work as a resource for thinking about interactions with algorithmic systems. Crozier challenges perspectives that overemphasize the rational organizational structure of bureaucracy, placing emphasis instead on the strategic efforts of different stakeholders within these organizational systems. Looking at algorithmic systems from this point of view allows for analysing strategic interactions at the system level in a manner that does not do away with the impact of networked systems, but also keeps us focused on the possible tensions between the different human actors.
There has been a lot of debate following Cathy O’Neil’s op-ed in the New York Times (https://t.co/DnsK7Vv7fH). These tweets are part of that, and they match my argument that it is time for the traditional disciplines to step up, too, not just institutes that are specifically targeted at understanding contemporary technologies and their implications.