AI anxiety is not about the technology - it is about losing control
Employees are not afraid of AI algorithms. They are afraid of losing agency over their work, their relevance, and their future. Here is how to fix it.

Key takeaways
- AI anxiety stems from loss of control, not technology fear - Job displacement fears jumped from 28% to 40% in two years, driven by employees feeling like passive recipients of change
- Leadership silence amplifies anxiety - Fewer than 20% of employees have heard from their manager about how AI will affect their job, making the communication gap itself a source of fear
- Involvement beats reassurance every time - When employees help select and customize AI tools rather than just receive updates, anxiety drops and 62% feel leaders finally understand the emotional impact
- Training gaps create retention risks - Only one-third of employees received any AI training last year, and 36% planning to resign cite inadequate development as a driving factor
I was looking at EY’s research on AI anxiety when something jumped out at me. 71% of employees are worried about AI. But most people miss this: they are not afraid of the algorithms.
They are afraid of having no say in how those algorithms change their work.
The difference matters. Because if you think workplace AI anxiety is about technology fears, you will try to solve it with better explanations of how the technology works. When what people actually need is control over how it gets used.
At Tallyfy, we saw this pattern clearly. Automation anxiety disappeared the moment people realized they were becoming workflow designers rather than workflow followers. Not because we explained the technology better. Because we gave them the steering wheel.
Why AI anxiety is really about control
Research published in Nature shows that AI adoption significantly undermines psychological safety, which drives depression and stress. But dig into why. It is not the technology itself. It is the feeling of powerlessness that comes with it.
When you roll out AI tools to your team without involving them in the selection process, without letting them customize how the tools work, without giving them authority over when and how to use them - you have just told them they are now passive recipients of whatever comes next.
That feeling? That is what drives workplace AI anxiety.
Gartner found five distinct fears employees have about AI. Every single one traces back to control. Not fear of losing their job to robots. Fear of having their job redesigned without their input. Fear that inaccuracy or bias will affect their performance and they will have no way to fix it.
Mid-size companies actually have an advantage here. You can give people real influence over AI decisions. You are not dealing with enterprise-scale bureaucracy where every choice gets made three levels above the people doing the work.
The psychology behind workplace AI anxiety
Here is what actually happens when you introduce AI without giving people agency.
First, their sense of professional identity erodes. The work they spent years getting good at? Now a tool does it differently. They did not choose this. They did not shape it. They just showed up one day and their job changed.
Second, you create what researchers call the autonomy-control paradox. Studies show that job autonomy satisfies people’s need for control and increases engagement. But algorithmic control moderates that relationship. The AI makes decisions that used to be theirs. Even when the tool makes them more productive, they feel less in control.
Third, you trigger anticipatory anxiety. Not about what is happening now, but about what comes next. If this decision got made without them, what other decisions will? Mercer’s research shows job displacement fears climbed from 28% in 2024 to 40% in 2026. Deutsche Bank analysts warn that “anxiety about AI will go from a low hum to a loud roar” this year.
They are apprehensive because they do not know what control they will have over their future. And here is what makes it worse: fewer than 20% of employees have heard anything from their direct manager about how AI will affect their job. The silence itself becomes the message.
Giving employees agency instead of reassurance
Stop telling people AI will not replace them. 62% of employees already feel their leaders underestimate AI’s emotional and psychological impact. Empty reassurance makes it worse.
Start showing them how they will direct AI to do better work.
The distinction is everything. Reassurance is passive. Agency is active. One makes people feel better temporarily. The other changes the actual power dynamic.
What this looks like in practice: When you are evaluating AI tools, include employees from all levels in the selection process. Not as rubber stamps. As actual decision-makers. EY found that 77% of employees would be more comfortable with AI if people from all levels were involved in adoption decisions.
Give employees authority to customize how AI tools work in their specific contexts. Let them set boundaries on what the AI handles versus what stays human. Let them override AI recommendations when their judgment says otherwise.
This is not about making everyone feel good. It is about building systems where people maintain meaningful control over their work.
One company I know lets teams vote on whether to adopt specific AI features. Not all features. Just the ones that change how core work gets done. They have slower adoption. They also have virtually no workplace AI anxiety. Because people chose this.
Building support systems that actually work
Training helps. But not the way most companies do it.
Research shows that self-efficacy in AI learning moderates the relationship between AI adoption and job stress. Higher self-efficacy weakens the stress connection. But you do not build self-efficacy through mandatory training sessions where someone talks at people for three hours.
You build it through peer support networks and safe experimentation spaces.
Set up learning groups where employees teach each other what they have figured out. Not formal training. Just spaces where someone who discovered a useful AI workflow can show five colleagues how it works. Where people can ask questions that feel basic without worrying they should already know the answer.
Create tech champions - early adopters who are enthusiastic but not evangelical. They provide hands-on help. They share what went wrong when they tried something, not just what went right. Organizations with help desks and regular follow-up sessions see significantly lower resistance to technology adoption.
The key is making support ongoing rather than one-time. AI tools keep changing. Your support system needs to change with them.
And keep this in mind: only one-third of employees report receiving any AI training in the past year. Meanwhile, over 90% of enterprises are projected to face critical skills shortages by 2026. That gap? That is where anxiety grows.
Long-term culture shifts that reduce anxiety
This is not about managing a transition. This is about building an organization that can handle ongoing technological change without creating ongoing anxiety.
The pattern you want to establish: employees have real influence over tools and processes, not just nominal input. When that becomes normal, workplace AI anxiety becomes manageable rather than existential.
Some practical culture shifts that work:
Make experimentation explicitly safe. Create spaces - literal Slack channels or meeting times - where people can test AI approaches and discuss what failed without any performance implications. Clinical psychologists report increasing numbers of workers discussing AI anxiety in therapy - the most common fear being “becoming obsolete.” When psychological safety exists, people treat AI as something they can shape rather than something that shapes them.
Build feedback loops that actually change things. When someone says an AI tool is creating problems, and you fix it based on their input, you just demonstrated they have control. When you listen but nothing changes, you proved they do not.
Give people authority to disconnect from AI when it makes sense. Sometimes the human approach works better. If employees need permission to override the AI, you are telling them the algorithm has more authority than they do.
Link AI adoption to skill development rather than job replacement. Workers with AI skills command wage premiums up to 56% higher than those without. When someone becomes good at directing AI, does that open new opportunities for them? Or just make them more efficient at their current job? One creates positive anticipation. The other creates resignation.
36% of employees planning to resign within a year cite inadequate training and development as a driving factor. And 45% of leaders say they would leave their company if it significantly lagged in AI adoption. The anxiety cuts both ways. But communication without agency still leaves people as spectators. You want them as participants.
Mid-size companies can move faster on this than enterprises. You can change actual practices, not just write new policies. You can give teams real budget authority to choose their tools. You can let someone who finds a better AI approach roll it out to their department without eighteen approval layers.
The companies that handle AI well will not be the ones that explained it best. They will be the ones that gave people control over how it changed their work. That starts with recognizing what workplace AI anxiety actually is - a control problem disguised as a technology problem.
Fix the control problem. The anxiety takes care of itself.
About the Author
Amit Kothari is an experienced consultant, advisor, coach, and educator specializing in AI and operations for executives and their companies. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.
Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.