Future of Work

The new AI-augmented job descriptions

Every job is becoming an AI-augmented job. Rewrite roles around human-AI collaboration, not just AI tool usage. Mid-size companies can lead this shift without corporate theater.

Key takeaways

  • Stop asking for "AI experience" - focus on collaboration skills like prompt engineering and output evaluation instead
  • Every role needs judgment capabilities - humans supervise AI work, not the other way around
  • Test actual AI collaboration - give candidates real AI tools and see how they handle augmented workflows
  • Reframe jobs around value creation - what humans uniquely contribute when AI handles the repetitive work

Last week a client showed me their new job posting. “5+ years AI experience required.” For a marketing manager role.

Five years ago, ChatGPT didn’t exist. That’s the absurdity we’re dealing with - companies copying old hiring patterns for fundamentally new work structures. After helping dozens of mid-size companies rewrite their job descriptions for AI-augmented work, I’ve learned something crucial: We’re asking the wrong questions entirely.

Why “AI experience preferred” misses everything

What kills me about current job postings: IMF research shows that 39% of today’s skills will become outdated or transformed by 2030 - and skill demands are changing 66% faster in AI-exposed roles. Yet most companies still write job descriptions like it’s 2019.

“Must have experience with ChatGPT.” Really? That’s like requiring experience with Google in 2005.

The real issue? The WEF found that 63% of employers cite skill gaps as their biggest barrier to business transformation. But they’re looking for the wrong skills.

You don’t need someone who’s used AI tools. You need someone who thinks in human-AI workflows.

A marketing manager who understands when to let AI draft content versus when human creativity matters. An analyst who knows which data patterns require human judgment versus algorithmic processing. These aren’t tool skills - they’re collaboration patterns.

At Tallyfy, we stopped asking “Do you know AI?” and started asking “Show us how you’d design a workflow where AI handles first drafts and humans add strategic insight.” The quality of candidates immediately improved. Not because they knew more tools, but because they understood the division of labor.

The real competencies that matter now

I was reading PwC’s AI Jobs Barometer: workers with AI skills now command a 56% wage premium, up from 25% the year before. But here’s what those “AI skills” actually are:

Prompt engineering isn’t writing. It’s structured thinking. Coursera’s analysis shows prompt engineers spend hours perfecting single templates used thousands of times. That’s systems thinking, not creative writing. And it’s evolving - Gartner now calls it “context engineering,” designing the relevant data, workflows, and environment so AI systems understand intent.
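To make the “single template used thousands of times” point concrete, here’s a minimal sketch (hypothetical names and rules, not any real company’s workflow) of a parameterized prompt template. The systems-thinking work lives in the structure and constraints; the inputs change thousands of times while the template barely changes.

```python
# Hypothetical example: a prompt treated as a versioned, reusable
# artifact rather than a one-off chat message.
PROMPT_V3 = """You are drafting {content_type} for {brand}.
Voice: {voice_rules}
Hard constraints:
- Maximum {max_words} words
- No claims without a cited source
- Flag any statistic you are unsure of with [VERIFY]
Draft the {content_type} for this brief:
{brief}"""

def build_prompt(content_type: str, brand: str, voice_rules: str,
                 max_words: int, brief: str) -> str:
    """Fill the template with one job's inputs."""
    return PROMPT_V3.format(
        content_type=content_type,
        brand=brand,
        voice_rules=voice_rules,
        max_words=max_words,
        brief=brief,
    )

prompt = build_prompt(
    content_type="blog intro",
    brand="Acme Analytics",
    voice_rules="plainspoken, no hype, second person",
    max_words=120,
    brief="Announce the Q3 forecasting feature.",
)
```

The constraints (word caps, [VERIFY] flags for uncertain claims) are where output evaluation gets designed into the workflow up front, instead of being bolted on afterward.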

Output evaluation beats output generation. Anyone can generate AI content. The skill is knowing when it’s wrong. Research on LLM evaluation shows judging is easier than generating - which means your employees need editorial skills, not production skills.

Workflow design trumps tool mastery. Tools change weekly. The WEF estimates 22% of jobs will be disrupted by 2030, but the real change is workflow transformation. Employees who can reimagine processes matter more than those who memorized ChatGPT commands.

Think about it - when’s the last time a software update fundamentally changed how you work? Every week if you’re using AI. Tool expertise expires faster than milk.

Rewriting core job functions

Let me show you how we rewrote a financial analyst position last month.

Old version:

  • Prepare monthly financial reports
  • Analyze budget variances
  • Create forecasting models
  • Support management decisions

Boring. Generic. Could be from 1995.

AI-augmented version:

  • Supervise AI-generated financial reports for accuracy and context
  • Identify which variances require human investigation versus algorithmic flagging
  • Design prompts for AI forecasting, then validate assumptions
  • Translate AI insights into strategic recommendations management actually understands

See the shift? Every task assumes AI handles the grunt work. The human adds judgment, context, and translation.

The World Economic Forum found that 41% of companies plan workforce reductions by 2030 due to AI automation. But they’re creating 170 million new jobs while eliminating 92 million old ones - a net increase of 78 million jobs. The new jobs? They’re all about human-AI orchestration.

Templates that actually work

After testing dozens of formats, here’s what resonates with candidates who get it:

Marketing roles with AI augmentation

“You’ll collaborate with AI to produce content at 10x speed while maintaining brand voice. AI drafts, you direct. You’ll spend 20% of time on prompt engineering, 30% on strategic planning, and 50% on creative work AI can’t replicate - understanding customer emotions, crafting narratives that resonate, building authentic connections.”

Notice what’s missing? Tool names. Version numbers. Certification requirements.

Operations roles with AI support

“You’ll design workflows where AI handles data processing and pattern recognition while you focus on exception handling and strategic improvements. Success means knowing when to trust algorithmic recommendations and when human judgment overrides the model.”
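One way to read “knowing when to trust algorithmic recommendations”: an explicit routing rule. This is a minimal sketch with hypothetical thresholds and field names, where the model’s suggestion auto-applies only when confidence is high and the stakes are low; everything else escalates to a person.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    item_id: str
    action: str        # what the model suggests doing
    confidence: float  # model's self-reported confidence, 0..1
    amount: float      # business impact in dollars

# Hypothetical policy: trust the model only when it is confident
# AND the downside is small; otherwise a human decides.
CONF_THRESHOLD = 0.90
AMOUNT_THRESHOLD = 5_000.0

def route(rec: Recommendation) -> str:
    """Return 'auto_apply' or 'human_review' for one recommendation."""
    if rec.confidence >= CONF_THRESHOLD and rec.amount < AMOUNT_THRESHOLD:
        return "auto_apply"
    return "human_review"

# A confident, low-stakes call goes through; anything uncertain
# or expensive lands on a person's desk.
assert route(Recommendation("a1", "approve refund", 0.97, 120.0)) == "auto_apply"
assert route(Recommendation("a2", "approve refund", 0.97, 25_000.0)) == "human_review"
assert route(Recommendation("a3", "approve refund", 0.60, 120.0)) == "human_review"
```

The point of the exercise isn’t the code; it’s that the operator you hire should be able to argue about where those two thresholds belong, and when the policy itself needs a human override.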

This attracts operators who think in systems, not button-pushers who follow manuals.

Customer service augmentation

“AI will handle initial responses and routine queries. You’ll manage complex emotional situations, train AI on new response patterns, and identify where human empathy beats algorithmic efficiency. You’re not competing with AI - you’re conducting it.”

That last line came from a conversation with a client who used to be an orchestra conductor. Perfect metaphor.

Testing and hiring for human-AI collaboration

Here’s where most companies fail spectacularly. They test AI knowledge with quizzes.

“What’s a token limit?” Who cares?

Instead, we developed practical tests:

The correction test: Give candidates AI-generated content with subtle errors. Marketing copy with wrong brand voice. Financial analysis with flawed assumptions. Code with logical bugs. See who catches what.

The prompt iteration challenge: Start with a terrible prompt. Ask them to improve it iteratively. You learn more from their process than their result.

The workflow design exercise: “Here’s a manual process. Design an AI-augmented version.” The best candidates don’t just add AI - they reimagine the entire flow.

One candidate completely restructured our customer onboarding process during her interview. She identified three places where AI could handle repetitive tasks, two where human touch was essential, and one where the process shouldn’t exist at all. Hired immediately.

Stop looking backward. Indeed’s latest data shows AI-mentioning job postings running 134% above baseline, with nearly 45% of data and analytics roles now requiring AI skills. But we still hire developers like we’re building COBOL systems.

Three things you can do tomorrow:

First, rewrite one job description. Pick your most urgent hire. Remove all tool requirements. Add collaboration capabilities instead. Watch how candidate quality changes.

Second, design one practical test. Not “explain AI” but “use AI to solve this real problem we face.” WEF research shows 85% of employers plan to offer upskilling and 77% provide AI training - but companies need to test practical application, not theoretical knowledge.

Third, change your interview questions. Stop asking about experience. Start asking about approach. “How would you decide whether to use AI for this task?” beats “Have you used GPT-4?” every time.

The companies winning aren’t those with the most AI tools. They’re those who understand that every job is becoming a human-AI collaboration role. The question isn’t whether your employees will work with AI. It’s whether your job descriptions admit it.

Mid-size companies have an advantage here. No bureaucratic approval chains. No corporate doublespeak. You can write honest job descriptions that attract people who actually want to work this way.

Because McKinsey’s research shows 75% of current jobs will require redesign, upskilling, or redeployment by 2030. We’re all conductors now.

And the best conductors don’t play every instrument. They know how to make them play together.

About the Author

Amit Kothari is an experienced consultant, advisor, coach, and educator specializing in AI and operations for executives and their companies. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.

Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.