Discover more from Horrific/Terrific
🐍 Automation is a snake eating its own tail
The division of labour and the fallacy that automation provides freedom from work
Here’s a funny funny joke for you: a couple of weeks ago I was rejected from a job because I was not, according to them, passionate or knowledgeable about AI — a subject I write and think about constantly. If they had asked me even a single question about AI they perhaps would have known that. So, it’s safe to say I’m tired of this bullshit; the job market is utterly horrendous at the moment. If you know anyone who needs someone like me, drop me an email! I know I started this paragraph with ‘here’s a joke’, but I’m actually quite serious. Here’s my CV.
Now, what you actually came here to do: Over the past week, I have been thinking A LOT about AI and human labour, and how one will affect the other. Scroll curiously downward for roughly this:
In late-stage capitalism, humans are expected to work like machines: either because the work is repetitive and alienating, or because it is actively cold and uncaring. Ironic, because these days AI has become very good at emulating humans? Anyway…
Scale and efficiency have seen labour become divided: in this sense, wages are suppressed, individuals are deskilled, and machines can more easily replace single-task repetitive jobs
Automation only creates more tasks: if 20% of your work can be automated, it will be replaced with work that is not yet automated. If 100% of your work can be automated… that really sucks.
Ultimately, ‘caring labour’ is the only thing that machines cannot replicate: but this kind of labour is horrifically undervalued — so it’s unclear what the goal is with this whole automation thing tbh
Dunno if any of you remember the movie Sicko, where Michael Moore spends two hours demonstrating the absolute state of the US healthcare system. There’s a part that still really sticks out for me: there was a woman they interviewed who worked for a health insurance provider, who spoke in detail about how often she had to reject people’s claims because they did not meet the provider’s narrow, inhuman conditions. By the end of her interview she was crying, tired and ashamed of how many times she had told someone that they didn’t get to have the treatment they needed, saying, “this is why I’m such a bitch on the phone to people” — because she couldn’t face exhibiting a single shred of humanity while doing this job. Makes sense; it’s easier to be cold and heartless when denying people their human rights.
What this woman did at her job, and what many of us do in workplace or institutional contexts, was actively dehumanise herself in order to achieve outputs desired by her employer. This is simply a reality of capitalism: many of us have to make adjustments to ourselves so that we ‘fit in’ as productive and obedient employees; the pressure of having to make money to stay alive pushes us to engage in bizarre alienating activities, like sending progress reports to unchecked mailboxes, sorting items in fulfilment centres, or literally telling another human that they do not meet the criteria to get medical treatment they desperately need.
This clinical uncaring work is the work of machines. Modern humans learn — via educational institutions — to adhere to a culture in which dehumanising work is considered the norm. And then, as systems scale and become (supposedly) more efficient, labour becomes ever more divided, and then sub-divided further into very boring tasks. The profit-making mechanisms of our economic system mean that, of course, capitalists will lean on automation and AI to reduce the cost of labour and/or increase the speed of production. This is why I’m often quite alarmed at the critics who recoil in disgust when they see that profit-making businesses utilise these tools as much as they can, and potentially threaten to replace human workers. Why would they not do that? They’re just continuing exploitative practices that have existed for generations. LM Sacasas made this very good point in his recent piece about AI and labour:
“Indeed, we might go further and say that the triumph of modern institutions is that they have schooled us even to desire our own obsolescence. If a job, a task, a role, or an activity becomes so thoroughly mechanical or bureaucratic, for the sake of efficiency and scale, say, that it is stripped of all human initiative, thought, judgment, and, consequently, responsibility, then of course, of course we will welcome and celebrate its automation”
So now we’ve reached this beguiling ironic stage of human existence where we have trained ourselves to behave like machines, and divided labour up into tasks that are inherently machine-readable, just so that machines can be trained to behave more like us. It’s like a snake eating its own tail. From this, it’s easy to assume that we will be pushed further into machine-like behaviour, while machines get to make art and write Netflix originals. But, if we consider what LM Sacasas is saying, there is obviously appetite to also automate the work that is mechanical or bureaucratic — and then there is an assumption that automating this work will free us from it, and thus give us more time to ‘be human’.
This is a kind of ‘automation is freedom’ fallacy. There seems to be a lack of understanding of what automation actually means if you live in a capitalistic society. Automation does not free you from anything (except maybe being able to afford food and housing). Automation suppresses workers and their already dwindling bargaining power. Automation can only provide freedom if we accept that it’s okay for humans not to work, and still receive everything they need to live. In other words, it requires a major shift in political will. The rampant tech bro will insist that AI and automation will make your life easier, but what they mean is that you can have six jobs instead of just the one — because they absolutely love work and cannot exist outside of the logics of exploitation and ‘gaming the system’. This is fine I guess but I really think we can do better with these tools.
I think it’s extremely important to think about the division of labour when discussing how AI might change the way we work. Not all work is repetitive and alienating — some people get to do fun, challenging work where they create exciting new things or work on interesting complex problems. Take for instance an office building full of people doing intellectual labour (perhaps they are building the next ‘killer app’ idk). When they go home, a set of cleaners come in to clean the building. Technically, there is absolutely no reason why the office workers can’t clean up after themselves: they could all pitch in and empty rubbish bins, wipe down surfaces, and mop the floors if they really wanted to. But the work of intellectuals is somehow considered ‘higher’ than cleaning, and their time too valuable to spend on it — therefore cleaning tasks are immediately undervalued, and the role of ‘cleaner’ is created.
In this office building, the office workers engage in activities which are pretty varied, but purely intellectual. The cleaners, however, just clean. This is fine if you enjoy calm, meditative work. Making things clean and shiny is extremely satisfying. But cleaners are on the whole underpaid, and perceived as easily replaceable. There’s a misconception that these workers are ‘replaceable’ because their work is easy. That isn’t it; it’s because they exist in a system of inequality and exploitation. Employers are very aware that behind every employee within the lowest pay-grade is someone else willing to do their job (and behind those people: a machine. But we’ll get to that). It has nothing to do with how easy/hard the work is (if you can even measure that) — it’s about how the labour itself is valued, and that there is a portion of the population who are desperate for work, and so will begrudgingly offer up their labour for much less than it’s worth.
Exploitative undervalued work not only keeps labour divided, but also deskills a large portion of the workforce. An underpaid cleaner or overstretched warehouse worker has much less money and time to enrich themselves for fun or even develop transferable skills that might lead to work that is paid more fairly. Furthermore, custodial staff at tech companies even have to sign non-competes so that they can’t seek similar work with competitors, which suppresses wages even further. However, those privileged enough to be doing intellectual labour probably have a wider range of skills, and the time and means to learn even more — as well as the money and resources to fight for their rights if unfairly dismissed.
All of this makes it pretty obvious that the kind of work which is most exposed to AI is work that is both undervalued and replicable by machines. So a ‘single task’ job like ‘cleaner’ could only really be replaced if a machine can replicate this work cheaply enough — and if it can, that’s an entire workforce of humans who have to find something else. Then, in an ‘intellectual’ job, one might find that certain tasks and activities are replaced, and eventually perhaps, certain career paths. We’re already seeing vast avenues of undervalued intellectual labour under threat of replacement by machines: this is in part why actors and writers are out on strike, because studio heads don’t care enough about that work for it to be ‘good’ — it simply needs to be produced at a convincing enough standard.
All these observations help us flesh out two broader points:
Once you peel away the ‘automation is freedom’ fallacy, you see that the addition of AI in the workplace is the addition of more tasks. Those who see their tedious bureaucratic work tasks replicated by machines will find themselves given more tasks so that their pay is justified.
Because of how labour is divided, those who engage in labour which is just one task over and over might see their entire job replaced — if that’s even possible. In this case, there seems to be just one kind of labour left to sell, that might be too disgusting to replicate even for the most morally challenged capitalist: caring labour.
Hello, I am slow-dying of malnutrition. Please help me
I once had a really awful job working for a shouty red-faced accountant. It was kind of split equally between doing bullshit social media marketing and being his assistant. The social media stuff mostly entailed sending the same message to different accountants on LinkedIn. He had no idea you could very easily automate this, and I did not tell him that I used LinkedHelper to message people every day — he honestly thought I was just very fast at copy-pasting. If I told him that I had automated this portion of my work, I knew he would either reduce my pay, or give me more soul-crushing tasks. I did not want either of these things.
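To give a sense of how trivially automatable that half of the job was: the entire task was filling one template with a different name over and over. Here’s a minimal sketch of that kind of templated messaging (not LinkedHelper’s actual tooling — all names and message text here are hypothetical):

```python
# A sketch of the "work": the same message, personalised per recipient.
# Sending is deliberately left out; the point is that the human part of
# the job reduces to a string template and a loop.

MESSAGE_TEMPLATE = (
    "Hi {name}, I came across your profile at {firm} and thought "
    "you might be interested in our accountancy newsletter."
)

def build_messages(recipients):
    """Fill the template once per recipient -- the entire 'job'."""
    return [
        MESSAGE_TEMPLATE.format(name=r["name"], firm=r["firm"])
        for r in recipients
    ]

recipients = [
    {"name": "Priya", "firm": "Acme Accounting"},
    {"name": "Tom", "firm": "Ledger & Co"},
]

for message in build_messages(recipients):
    print(message)
```

A loop like this is indistinguishable, from the boss’s side of the screen, from someone who is “just very fast at copy-pasting”.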
The other half of my job for the accountant definitely counts as what David Graeber refers to as ‘caring labour’ in his book Bullshit Jobs. This kind of labour “is generally seen as work directed at other people, and it always involves a certain labor of interpretation, empathy, and understanding.” I spent a great deal of time and energy anticipating and attending to the accountant’s needs. It turns out that his main need was to yell at me, so I didn’t have this job for very long.
Caring labour covers a lot of bases: it stretches from literally raising a child or looking after someone who is unwell (which are both extremely undervalued jobs), to any job where you are seen to be ‘taking care’ of someone else: e.g. managing a calendar, running errands, serving food or, in the case of the accountant, just being a subordinate — someone to make them feel superior and important.
Machines cannot effectively replicate underlings. The whole point of having someone serve you is to demonstrate that you are rich enough to afford human labour. The entire illusion of status would melt away if machines suddenly started serving us canapés and tucking our children into bed. What satisfaction would the accountant get from yelling at a machine? I also worked as a waitress for eight years, serving rich people at parties, charity dinners, and livery halls. How different would the guests feel if the delivery of their food from the kitchen to the table was somehow automated? It would definitely alleviate the pressure to say ‘thank you’ upon receiving the food but beyond that I don’t see the point. Imagine sitting in a lavish dining hall and having your dinner awkwardly delivered to you by a tray on wheels. 80% of my job as a waitress was smiling, saying yes, walking quickly, and — unfortunately — looking attractive. In other words: my job was to be a human.
A large part of ‘being human’ is also having and raising children. Machines DEFINITELY can’t do that. Nor do they need to: first of all it would be creepy and disgusting. Secondly, this labour is already completely unpaid. People raise children because they want to (I mean… you’d hope that’s why people do it). Within the framework of markets and commodities, raising a child does not need to be made cheaper by automation. It is quite literally a labour of love. In Bullshit Jobs, David Graeber interviewed a woman who volunteered for the Wages for Housework Campaign in the 70s and 80s, and found that their proposal to pay women for their caring responsibilities was met with equal measures of enthusiasm and disgust:
“The reaction we used to get on the street when we leafleted for Wages for Housework was, either women would say, ‘Great! Where can I sign up?’ or they'd say, ‘How dare you demand money for some thing I do for love?’ That second reaction wasn't entirely crazy, these women were understandably resistant to commodifying all human activity in the way that getting a wage for housework might imply”
Capitalism really does weave the harshest and most backward narratives. The ultimate piece of caring labour we engage in — to have children and therefore guarantee the continuation of our own species — is completely unpaid, even though we require money to live. In fact, it’s considered by some to be morally reprehensible to receive money to take care of your own children.
Let’s go back to the woman in Sicko whose job it was to refuse care at times. She got paid to do this. This vapid uncaring work is valued higher than caring work, because it requires you to dehumanise yourself — it’s a horrible soul-crushing job so it’s only fair that you get a bit of money in exchange. Whereas caring labour simply entails ‘being human’, which means you should require no compensation, because you are engaging in loving and fulfilling work. And that should be enough.
With these ridiculous logics in mind, it makes me wonder what the builders and purveyors of new AI systems are actually aiming for, if we are truly never going to be adequately compensated for caring labour. Why are they so desperate to replicate human output and automate huge tranches of paid work? Who does this actually help in the long run? They seem to be gunning for a strange, bleak, and unreasonable future where we can’t just live our lives, and instead have to engage in the unpaid labour of ‘being human’ where we produce proto-content for machines to ingest and replicate. That can’t be what anyone wants! So why are we automating labour?
*Becomes a luddite and disappears forever*