Two years ago this CEO fired 80% of staff for refusing AI – now he says he was right

The fallout still divides the tech industry.

In early 2023, when generative AI was triggering more panic than plans inside most companies, IgniteTech CEO Eric Vaughan forced a choice on his staff: embrace AI fast, or leave. Most left. Now, as he touts soaring margins and new patents, his radical experiment is becoming a reference point in every argument about what AI adoption should look like at work.

A blunt ultimatum: love AI or leave the company

IgniteTech, a US-based enterprise software firm, was hardly a household name in 2023. Inside the company, though, the arrival of ChatGPT and rival tools triggered what Vaughan describes as an “existential” moment.

While many executives treated AI as an interesting add‑on, he treated it as a survival deadline. Staff were told that AI was not a side project or an innovation lab toy. It would define who stayed on the payroll.

For Vaughan, failing to adopt AI quickly was equivalent to putting the company on a path to an early death.

According to his account shared with Fortune, those who resisted this shift were not just sceptical; they were seen as obstructing a transformation he believed was non‑negotiable. Within a year, about 80% of IgniteTech’s workforce had been shown the door and replaced with people recruited specifically for their “AI-compatibility”.

‘AI Mondays’: a company rewired around automation

Vaughan’s approach did not start with layoffs. It started with ritual. He introduced “AI Mondays”, a weekly rule that applied to everyone in the firm.

Each Monday, employees were barred from their usual work: no client calls, no budgeting, no routine admin. They were allowed to work only on AI-related projects, regardless of their job title.

The goal was to force the organisation to build muscle memory around AI tools and workflows. Instead of asking employees nicely to “try ChatGPT when you have time”, he cleared their schedules and made experimentation mandatory.

He also poured money and time into training. Staff were promised what he framed as a “gift”: heavy investment in tools, education and AI projects designed to give each person a new, marketable skillset.

Who resisted AI – and why that surprised the CEO

Despite the training and dedicated time, large pockets of the workforce pushed back in subtle and not‑so‑subtle ways. Vaughan says some employees delivered deliberately weak work when they used AI tools. Others simply failed to show up to prompt‑engineering sessions.

The group dragging its feet most, according to his account, was not the back office. It was the technical staff: software engineers and developers who many outsiders would assume were naturals with automation.

The reasons were varied. Some feared being automated out of a job. Others distrusted early AI outputs and worried about quality and security. Some had spent years honing manual processes and felt the ground shifting under their feet.

Within IgniteTech, the most technically skilled profiles were often the most reluctant to rebuild their work around AI.

For Vaughan, this resistance became a red line. He judged that he was losing precious time trying to convert people who did not want to change. At that point, his framing shifted from training to triage.

The mass firing: 80% out, new AI‑native talent in

Over roughly twelve months, IgniteTech replaced nearly four out of five employees. The new hires were selected not just for technical abilities, but for their appetite to work hand‑in‑hand with AI systems.

In his telling, the decision was emotionally tough but strategically simple. He believed that the longer the company carried AI sceptics, the lower its chances of staying competitive in an AI-heavy software market.

Vaughan later said changing people’s mindsets was harder than adding new skills, and that he would make the same call again.

He now claims the bet paid off. IgniteTech’s margins are reportedly approaching 75%, and the company has filed at least two patents for AI-based solutions. Those numbers are hard to independently verify, but they are already being cited in boardrooms as a case study in aggressive AI adoption.

What changed inside IgniteTech after the purge

According to Vaughan, three major shifts followed the staff overhaul:

  • Product development cycles shortened as AI tools were used across coding, testing and documentation.
  • Customer support and implementation were partially automated, cutting labour costs and response times.
  • Internal decision‑making leaned on AI‑generated analysis, reducing manual reporting and forecast work.

The company repositioned itself as an “AI‑first” software provider, promising clients faster releases and lower costs. That narrative, combined with a leaner cost base, helped push up profitability.

A growing pattern in big tech, with a twist

Vaughan’s story is extreme, but the broad direction matches what larger tech firms are doing with more PR polish. Amazon, Microsoft and Meta have all reorganised teams around AI initiatives, trimming staff in older product lines while hiring aggressively in AI research, infrastructure and tooling.

The difference is that those firms rarely say the quiet part out loud. They frame cuts as restructuring or “shifting priorities”, not a direct referendum on who is willing to use AI at work.

IgniteTech turned AI enthusiasm into an explicit hiring and firing criterion, instead of a soft competency on a job description.

Interestingly, Vaughan himself says he does not recommend that other CEOs copy his playbook. He describes the experience as “extremely difficult”, financially and emotionally, and stresses that mass firings were not the original goal. The intention, he insists, was to upskill, not purge.

The ethical and practical fault lines

His story raises uncomfortable questions for both employers and employees.

Issue | Risk | Potential benefit
Forced AI adoption | Loss of trust, fear‑driven culture | Faster organisational learning
AI as firing criterion | Claims of unfair dismissal or discrimination | Highly aligned, AI‑fluent workforce
Heavy automation | Job losses, skills obsolescence | Higher margins, new AI‑centric roles
Rapid retraining | Burnout, shallow learning | Employees gain portable, in‑demand skills

Labour lawyers point out that tying job security so closely to the use of a specific class of tools can open companies up to legal risk, especially if older or disabled workers are disproportionately affected. Unions are already starting to negotiate AI clauses into contracts, covering training, data use and automation thresholds.

What “AI-compatible” really means at work

The phrase “AI‑compatible” sounds buzzy, but in practice it usually means a mix of attitudes and habits rather than deep technical expertise.

In many office settings, AI‑compatible staff tend to:

  • Actively test tools like ChatGPT, Claude or internal models on routine tasks.
  • Track where AI helps, where it fails and how to adjust prompts or workflows.
  • Flag ethical and security issues early instead of ignoring them.
  • Share working tactics with colleagues instead of treating them as personal shortcuts.

For most roles, that level of engagement matters more than being able to build models from scratch. Companies like IgniteTech are effectively betting that culture beats pure technical background: a less experienced worker keen on AI might be more valuable than a seasoned engineer who refuses to touch it.

How this could play out for ordinary employees

Vaughan’s story invites a blunt thought experiment for anyone in a white‑collar job. Imagine your employer announces that, within 12 months, every team will be expected to deliver the same output with 30% less time, thanks to AI tools. Performance reviews are updated to measure not just results, but how much you leveraged AI.

In that scenario, three rough groups tend to form:

  • Accelerators – people who lean into AI, automate boring tasks and often end up mentoring others.
  • Pragmatists – those who use AI only where clearly useful, and look for clear guardrails.
  • Holdouts – employees who avoid AI entirely, often citing quality, ethics or job security fears.

IgniteTech’s story is what happens when leadership openly decides that the third group is no longer welcome. Most companies will likely move more slowly, but the direction of travel is similar: AI usage is sliding from “nice‑to‑have skill” towards baseline expectation.

Key concepts behind the clash

Two terms sit quietly behind this whole saga.

Prompt engineering: This is the practice of crafting and refining instructions for AI tools so they produce reliable outputs. Good prompts often specify role, tone, format, context and constraints. Employees who learn this quickly can turn generic models into powerful assistants tailored to their daily tasks.
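To make the idea concrete, the elements listed above can be assembled into a reusable template. The sketch below is purely illustrative (the field names and wording are hypothetical, not IgniteTech's actual practice): it builds a prompt that spells out role, tone, format, context and constraints before stating the task.

```python
# Illustrative sketch of prompt engineering as described above: a template
# that makes role, tone, format, context and constraints explicit.
# All names and example values here are hypothetical.

def build_prompt(role, tone, output_format, context, constraints, task):
    """Assemble a structured prompt from named components."""
    return "\n".join([
        f"Role: {role}",
        f"Tone: {tone}",
        f"Format: {output_format}",
        f"Context: {context}",
        f"Constraints: {'; '.join(constraints)}",
        f"Task: {task}",
    ])

prompt = build_prompt(
    role="senior support engineer",
    tone="concise and neutral",
    output_format="numbered list",
    context="customer reports login failures after an upgrade",
    constraints=["cite only documented settings", "max 5 steps"],
    task="draft troubleshooting steps for the support team",
)
print(prompt)
```

A prompt structured this way leaves far less to the model's guesswork than a one-line request, which is why employees who internalise the habit get noticeably more reliable output from the same tools.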

Existential transformation: When Vaughan called IgniteTech’s AI shift “existential”, he meant the company’s survival, not abstract philosophy. For many software vendors, AI can undercut existing products while simultaneously offering new ones. Failing to react fast enough can mean being outpriced or out‑featured by rivals that automate sooner.

The tension is obvious: what feels like a strategic necessity at board level can feel like a rushed experiment at desk level. IgniteTech chose to resolve that tension with an almost brutal clarity. Whether that becomes a template or a cautionary tale will depend on how many other CEOs decide AI is worth losing most of their people for.
