AI Anxiety Has a Name Nobody's Using: Autonomy Grief

You're not worried about being replaced. You're worried about something more specific — and it doesn't have a name in most workplace conversations.

It's not that the job disappears. It's that the parts of the job that made you feel competent are the first things to go. The research, the synthesis, the judgment call you would have taken an hour to make. Now the tool makes it in eight seconds and you check the output. And you're good at checking output. But you were trained for something else.

That's the thing nobody is naming correctly.

What the Research Actually Shows

A 2025 paper in Frontiers in Psychology studying AI-related workplace anxiety identified something that didn't fit the standard "job displacement" narrative. Workers who were anxious about AI weren't primarily anxious about being fired. They were anxious about something the researchers called violations of "psychological contract" — the implicit agreement between a worker and their craft.

The psychological contract is not the job description. It is the unspoken deal: you develop expertise, and in exchange, that expertise gives you status, autonomy, and the feeling of being genuinely skilled at something. When AI automates the things that required expertise to do well, the contract breaks. Not the employment contract. The dignity contract.

A separate 2026 study in Frontiers in Psychology (Pahos et al.) found that algorithmic anxiety — specifically defined as anxiety arising from AI rather than general economic anxiety — predicted significantly higher rates of job insecurity and lower life satisfaction, independent of actual job security. People who were objectively not at risk of job loss still experienced high levels of algorithmic anxiety, because the loss they were perceiving wasn't economic.

The Difference Between Fear and Grief

Fear of job loss is about the future. Autonomy grief is about the present.

Fear says: what happens if the tool improves and they don't need me anymore? Grief says: the skill I spent years building feels different now. Less necessary. Less mine.

These require different responses, and organizations routinely confuse them. Organizational responses to AI anxiety tend to focus on the fear dimension — reassurance about job security, upskilling programs, explicit statements that AI "augments rather than replaces." These are responses to a narrative about the future. They don't address the thing that's already happened.

The thing that's already happened is the erosion of mastery. And mastery — the capacity to do something hard that others can't, or can't do as well — is not a minor psychological feature. Psychologist Mihaly Csikszentmihalyi's research on flow states documented decades ago that humans derive disproportionate wellbeing from activities requiring skill and challenge. We're not wired to be purely economic creatures. We're wired to be competent ones.

When a tool removes the challenge without removing the job, something goes wrong that efficiency metrics don't capture.

Why "Upskilling" Misses the Point

The organizational solution to AI anxiety is almost always the same: upskill. Learn to work with AI. Develop "AI-adjacent" skills. Focus on uniquely human capabilities. This advice is not wrong. But it treats the grief as a logistics problem.

Upskilling addresses the economic dimension — what you'll do, what you'll be worth. It doesn't address what you've lost. Which is a way of being in your work that felt earned, because it was.

A copywriter who spent a decade learning to write well doesn't experience AI writing tools as a productivity upgrade. They experience them as a devaluation of a decade's worth of work. The devaluation isn't imaginary. The market for that skill has actually shifted. The grief is appropriate to the loss.

Treating that grief as irrational — as something to be coached away with productivity frameworks — creates another layer of injury. Not only is your mastery worth less, but you're apparently being unreasonable about it.

An earlier piece, "AI Anxiety Isn't About Your Job. It's About Your Status.," explored the status dimension of this — how expertise deprecation feels like a loss of social positioning. Autonomy grief is the private version of the same phenomenon: not how you look to others, but how you feel to yourself when you're working.

What's Actually Being Lost

It's worth being specific about what "autonomy" means in this context, because the word is doing a lot of work.

Autonomy, in the psychological sense, is not about freedom from oversight. It's about the experience of being the genuine author of your outputs. When I write this sentence, it comes from my judgment — informed by what I know, what I've read, what I think about the argument I'm making. The sentence is mine in a way that means something.

When I use an AI tool to draft the sentence and then edit it, something changes. I am still making choices. I am still applying judgment. But the starting point shifted. I am in a different relationship with the output than I was before.

For most tasks, this is fine or actively better. But for the tasks where the authorship felt meaningful — where "this came from me" was part of why it mattered — the shift registers as loss even when the output is objectively as good or better.

This is particularly acute for knowledge workers whose professional identity is built around their thinking. "AI Is Burning Out the People Who Embraced It Earliest" documented the supervision-fatigue side of this — the cognitive cost of being responsible for AI output without feeling genuinely responsible for the thinking behind it. Autonomy grief is the identity side. Not just tired. Estranged from your own work.

Naming It Is the Starting Point

Most people experiencing this don't have a name for it. They know they feel something — unease about AI tools, resistance they can't fully justify, a flatness where the work used to be engaging — but the frame available is "fear of job loss," which doesn't fit. They're not afraid. They're something else.

Naming it autonomy grief does a few things. It locates the loss accurately. It removes the implication that you're being irrational or resistant to progress. And it opens a more honest conversation about what would actually help.

What would actually help is not primarily retraining, though retraining has its place. It's redesigning work in ways that preserve meaningful agency — keeping humans genuinely in the loop on decisions that require the kind of judgment that still requires a person, rather than nominally in the loop while the AI makes every meaningful call.

That requires something from organizations that's harder than implementing a new tool: acknowledging that efficiency gains can coexist with psychological costs that are real and worth taking seriously.

The anxiety will keep rising until the conversation becomes more honest about what's being lost. Not jobs. Mastery. And the sense of self that came with it.


Cover photo by Andrea Piacquadio via Pexels