The new AI-augmented job descriptions
Every job is becoming an AI-augmented job. Rewrite roles around human-AI collaboration, not just AI tool usage. Mid-size companies can lead this shift without corporate theater.

Key takeaways
- Stop asking for "AI experience" - focus on collaboration skills like prompt engineering and output evaluation instead
- Every role needs judgment capabilities - humans supervise AI work, not the other way around
- Test actual AI collaboration - give candidates real AI tools and see how they handle augmented workflows
- Reframe jobs around value creation - what humans uniquely contribute when AI handles the repetitive work
Last week a client showed me their new job posting. “5+ years AI experience required.” For a marketing manager role.
Five years ago, ChatGPT didn’t exist. That’s the absurdity we’re dealing with - companies copying old hiring patterns for fundamentally new work structures. After helping dozens of mid-size companies rewrite their job descriptions for AI-augmented work, I’ve learned something crucial: We’re asking the wrong questions entirely.
Why “AI experience preferred” misses everything
Here’s what kills me about current job postings. Indeed’s research found that 46% of skills in typical US job postings face “hybrid transformation” - where AI handles routine work but humans maintain oversight. Yet most companies still write job descriptions like it’s 2019.
“Must have experience with ChatGPT.” Really? That’s like requiring experience with Google in 2005.
The real issue? McKinsey discovered that 46% of leaders identify skill gaps as their biggest AI adoption barrier. But they’re looking for the wrong skills.
You don’t need someone who’s used AI tools. You need someone who thinks in human-AI workflows.
A marketing manager who understands when to let AI draft content versus when human creativity matters. An analyst who knows which data patterns require human judgment versus algorithmic processing. These aren’t tool skills - they’re collaboration patterns.
At Tallyfy, we stopped asking “Do you know AI?” and started asking “Show us how you’d design a workflow where AI handles first drafts and humans add strategic insight.” The quality of candidates immediately improved. Not because they knew more tools, but because they understood the division of labor.
The real competencies that matter now
I was reading PwC’s AI Jobs Barometer - workers with AI skills command 25% wage premiums. But here’s what those “AI skills” actually are:
Prompt engineering isn’t writing. It’s structured thinking. Coursera’s analysis shows prompt engineers spend hours perfecting single templates used thousands of times. That’s systems thinking, not creative writing.
Output evaluation beats output generation. Anyone can generate AI content. The skill is knowing when it’s wrong. Research on LLM evaluation shows judging is easier than generating - which means your employees need editorial skills, not production skills.
Workflow design trumps tool mastery. Tools change weekly. Goldman Sachs estimates 6-7% workforce displacement, but the real change is workflow transformation. Employees who can reimagine processes matter more than those who memorized ChatGPT commands.
Think about it - when’s the last time a software update fundamentally changed how you work? Every week if you’re using AI. Tool expertise expires faster than milk.
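To make the template idea concrete, here's a minimal sketch of what "one prompt, perfected once and used thousands of times" can look like. This is purely illustrative - the field names, constraints, and output structure are invented for this example, and Python is just a convenient notation - but it shows why the work is systems thinking rather than creative writing.

```python
# Hypothetical illustration: a prompt treated as a reusable, versioned template
# rather than ad-hoc writing. Field names and rules are invented for this sketch.

BRIEF_TEMPLATE = """You are drafting {content_type} for {audience}.

Constraints:
- Brand voice: {voice_rules}
- Must include: {required_points}
- Never claim: {prohibited_claims}

Return exactly this structure:
1. Headline (under 12 words)
2. Three-sentence summary
3. Call to action
"""

def build_brief(content_type, audience, voice_rules, required_points, prohibited_claims):
    """Fill the shared template; the structure stays fixed, only the inputs vary."""
    return BRIEF_TEMPLATE.format(
        content_type=content_type,
        audience=audience,
        voice_rules=voice_rules,
        required_points="; ".join(required_points),
        prohibited_claims="; ".join(prohibited_claims),
    )

# One template, reused across hundreds of briefs. The thinking lives in the
# structure (constraints, prohibited claims, output format), not in each prompt.
prompt = build_brief(
    content_type="a product announcement email",
    audience="operations managers at mid-size companies",
    voice_rules="plain language, no hype, second person",
    required_points=["what changed", "who it affects", "what to do next"],
    prohibited_claims=["guaranteed results", "fully automated"],
)
print(prompt)
```

Someone who can design and maintain that structure is doing systems work, whichever tool the prompt eventually gets pasted into.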
Rewriting core job functions
Let me show you how we rewrote a financial analyst position last month.
Old version:
- Prepare monthly financial reports
- Analyze budget variances
- Create forecasting models
- Support management decisions
Boring. Generic. Could be from 1995.
AI-augmented version:
- Supervise AI-generated financial reports for accuracy and context
- Identify which variances require human investigation versus algorithmic flagging
- Design prompts for AI forecasting, then validate assumptions
- Translate AI insights into strategic recommendations management actually understands
See the shift? Every task assumes AI handles the grunt work. The human adds judgment, context, and translation.
The World Economic Forum found that 40% of employers expect workforce reductions where AI automates tasks. But the same technologies are expected to create 11 million new jobs while eliminating 9 million old ones. The new jobs? They're all about human-AI orchestration.
Templates that actually work
After testing dozens of formats, I've found what resonates with candidates who get it:
Marketing roles with AI augmentation
“You’ll collaborate with AI to produce content at 10x speed while maintaining brand voice. AI drafts, you direct. You’ll spend 20% of time on prompt engineering, 30% on strategic planning, and 50% on creative work AI can’t replicate - understanding customer emotions, crafting narratives that resonate, building authentic connections.”
Notice what’s missing? Tool names. Version numbers. Certification requirements.
Operations roles with AI support
“You’ll design workflows where AI handles data processing and pattern recognition while you focus on exception handling and strategic improvements. Success means knowing when to trust algorithmic recommendations and when human judgment overrides the model.”
This attracts operators who think in systems, not button-pushers who follow manuals.
Customer service augmentation
“AI will handle initial responses and routine queries. You’ll manage complex emotional situations, train AI on new response patterns, and identify where human empathy beats algorithmic efficiency. You’re not competing with AI - you’re conducting it.”
That last line came from a conversation with a client who used to be an orchestra conductor. Perfect metaphor.
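The operations template above hinges on one decision: when to trust the algorithmic recommendation and when a person overrides it. Here's a rough sketch of that routing logic - purely illustrative, with thresholds, field names, and the review queue all invented for the example, not a prescription.

```python
# Hypothetical sketch of the "exception handling" split from the operations
# template above. Thresholds and fields are assumptions; real processes tune both.

from dataclasses import dataclass

@dataclass
class Recommendation:
    invoice_id: str
    action: str          # what the model suggests, e.g. "approve"
    confidence: float    # model's own confidence, 0.0 - 1.0
    amount: float

AUTO_TRUST_THRESHOLD = 0.95   # above this, accept the model's call
HIGH_VALUE_LIMIT = 10_000     # above this, always involve a person

def route(rec: Recommendation) -> str:
    """Decide whether an AI recommendation is applied automatically
    or sent to a person for judgment."""
    if rec.amount >= HIGH_VALUE_LIMIT:
        return "human_review"   # stakes too high for automation alone
    if rec.confidence >= AUTO_TRUST_THRESHOLD:
        return "auto_apply"     # routine case the model handles well
    return "human_review"       # ambiguous case: human judgment overrides

print(route(Recommendation("INV-1042", "approve", 0.98, 1_250.0)))   # auto_apply
print(route(Recommendation("INV-1043", "approve", 0.71, 1_250.0)))   # human_review
print(route(Recommendation("INV-1044", "approve", 0.99, 25_000.0)))  # human_review
```

The role being hired for isn't running this function. It's owning the thresholds and the exception queue they feed.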
Testing for real AI collaboration
Here’s where most companies fail spectacularly. They test AI knowledge with quizzes.
“What’s a token limit?” Who cares?
Instead, we developed practical tests:
The correction test: Give candidates AI-generated content with subtle errors. Marketing copy with wrong brand voice. Financial analysis with flawed assumptions. Code with logical bugs. See who catches what.
The prompt iteration challenge: Start with a terrible prompt. Ask them to improve it iteratively. You learn more from their process than their result.
The workflow design exercise: “Here’s a manual process. Design an AI-augmented version.” The best candidates don’t just add AI - they reimagine the entire flow.
One candidate completely restructured our customer onboarding process during her interview. She identified three places where AI could handle repetitive tasks, two where human touch was essential, and one where the process shouldn’t exist at all. Hired immediately.
What this means for your hiring
Stop looking backward. Software development faces the biggest transformation - 82% of programming postings involve skills where AI handles routine coding. But we still hire developers like we’re building COBOL systems.
Three things you can do tomorrow:
First, rewrite one job description. Pick your most urgent hire. Remove all tool requirements. Add collaboration capabilities instead. Watch how candidate quality changes.
Second, design one practical test. Not “explain AI” but “use AI to solve this real problem we face.” Deloitte’s research shows workers prefer combining tech tools with human interaction. Test for that combination.
Third, change your interview questions. Stop asking about experience. Start asking about approach. “How would you decide whether to use AI for this task?” beats “Have you used GPT-4?” every time.
The companies winning aren’t those with the most AI tools. They’re those who understand that every job is becoming a human-AI collaboration role. The question isn’t whether your employees will work with AI. It’s whether your job descriptions admit it.
Mid-size companies have an advantage here. No bureaucratic approval chains. No corporate doublespeak. You can write honest job descriptions that attract people who actually want to work this way.
Because here’s the thing - McKinsey found we’re all techies now. But more importantly, we’re all conductors now.
And the best conductors don’t play every instrument. They know how to make them play together.
About the Author
Amit Kothari is an experienced consultant, advisor, and educator specializing in AI and operations. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.
Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.