We have talked about the research, mapped the types, traced the history. This post is about the people.

The statistics are not nothing. Sixty-one percent of workers report anxiety about AI displacing their jobs. Axios spent months documenting white-collar AI displacement across industries. Researchers at the University of Florida found measurable mental health effects in workers whose jobs were restructured around automation. These numbers matter because they show scale. But numbers also do something uncomfortable: they let you stay at a distance. They let you process this as a trend rather than as something that is happening to specific human beings right now.

So let’s close that distance.

The stories below are composites, drawn from patterns that have been extensively documented across journalism, labor research, and worker testimonies over the past two years. No individual is fabricated. These are representative portraits of what displacement actually looks like on the ground, assembled from real accounts. If you recognize yourself or someone you know in them, that is not a coincidence.


The Content Team That Stopped Existing

Before the layoffs, she was three years into what she thought was a real career. Staff writer at a mid-sized digital publisher, decent salary, benefits, a team she liked. She covered consumer tech, wrote explainers, occasionally got to do longer reported pieces. The work was not glamorous but it was hers.

The restructuring announcement came in a meeting that lasted eleven minutes. The company had adopted an AI content platform. The writing team was being reduced from eight people to three. The remaining staff would “focus on editing, oversight, and quality assurance” of AI-generated drafts. Her position was eliminated. She had two weeks.

The part that stays with her is not the meeting. It is the email she got six weeks later from her former editor, checking in, mentioning that the AI platform was producing the explainers she used to write. "In eleven seconds," her editor said, half-joking. The same kind of competence that took her years to develop. Produced in eleven seconds. We talked about that moment in the first post in this series, as an abstraction. For her, it was a Wednesday afternoon and she was sitting in her apartment wondering what to put on her resume.

She has since done what she was told to do: she learned new skills. She took a course in AI prompt engineering. She freelances now, sporadically. Her income is roughly sixty percent of what it was. She describes the work as fine. She does not describe it as hers.


The Junior Developer Who Became Redundant Before He Started

He got his first job offer two months before graduation. Junior developer at a software consultancy, the kind of role that was supposed to be a foot in the door. He deferred his start date while the company “went through some structural changes.”

The structural changes turned out to be a decision to use AI coding assistants for the kind of work junior developers typically do: writing boilerplate, handling documentation, fixing straightforward bugs. The consultancy kept its senior staff. The junior roles were reclassified. His offer was rescinded. He was told the company would “keep him in mind for future openings.”

He is not the paralegal from our Type 1 scenario, updating a resume in quiet panic. He is worse off in some ways: he never got the job to lose. He has credentials and no career path. The jobs he is qualified for are being eliminated faster than he can apply for them. He has heard the advice about learning AI tools so many times that he has started responding to it with a specific silence he describes as “the pause before I decide not to say what I’m thinking.”

The advice assumes that the problem is skills. His problem is not skills. His problem is that the entry-level pipeline that has historically absorbed people like him is being removed, and the alternative pathways have not materialized. He is twenty-four years old and already describing his career as something that needs to be rebuilt.


The Call Center That Became a Footnote

She worked customer service for a regional insurance company for seven years. She was good at it. Not just technically competent: she was the person her colleagues sent difficult customers to, the one who could de-escalate a call that had gone badly, who could hear the thing behind the complaint and address that instead of the stated problem. Her manager told her this explicitly, more than once.

The company rolled out an AI-powered chatbot system over six months. The announcement framed it as augmentation, not replacement. Her role would shift to “escalation specialist,” handling only the calls the chatbot couldn’t resolve. For a while, this is what happened. Then the volume of escalations dropped as the chatbot improved. Then the escalation team was reduced. Then her position was eliminated.

The thing she keeps coming back to is the skill that made her valuable. The emotional attunement, the ability to hear what someone actually needed. That skill is invisible on a resume. There is no certification for it. When she tried to explain to a career counselor what made her good at her job, the counselor suggested she look into sales. She has not gone back to the counselor.

She is doing gig work now. She is also, she says, the angriest she has ever been in her life. Not at any individual, she is careful to say. At something more structural. The neo-Luddites we wrote about last week asked the question “who pays the cost?” She is one of the people who can answer that.


The Illustrator Who Watched Her Market Disappear

She had spent eight years building a freelance illustration practice. Editorial clients, book covers, brand work. She had a style that clients came back for. She had a waitlist. She had, she thought, insulated herself from displacement because her work was irreducibly personal. You hired her for her vision, not for generic output.

The market did not collapse overnight. It dissolved. Clients who had previously commissioned illustration started asking for “AI-generated options” first and human work only as a backup. Editorial budgets dried up as publications used image generation tools in-house. The book cover work thinned. By the time she understood what was happening, her income had dropped by half in eighteen months.

The psychological dimension here maps closely to what we called Type 8 in our taxonomy: identity anxiety. The work was not incidental to who she was. It was constitutive of it. When the market for that work contracted, she did not experience it primarily as a financial problem. She experienced it as something more destabilizing, a question about whether the thing that made her herself still meant anything.

She is still illustrating. She is also working a part-time administrative job she did not need two years ago. She describes herself as “recalibrating.” She does not say what she is recalibrating toward.


What These Stories Have in Common

Speed. Every one of these people describes the pace of change as the thing they were least prepared for. Not that AI would eventually affect their field, but that it would happen in months, not years. The standard advice assumes you have time to adapt. The standard advice is wrong.

Institutional abandonment. Unemployment insurance, in most states, was designed for a different kind of job loss. A Fortune investigation found that workers displaced by AI automation frequently do not qualify for benefits because the displacement happens through role restructuring rather than direct layoff. Career transition programs are underfunded, generic, and oriented toward the economy of five years ago. There is no infrastructure for this specific problem.

The inadequacy of “just reskill.” The advice has become a kind of tic, reflexive and useless. Learn to code. Learn prompt engineering. Learn to work with AI. This advice ignores two things. First, many of the skills being devalued are not technical; they are human in ways that resist easy substitution. Second, the jobs the advice is pointing toward are also in flux. You cannot reskill toward a stable landing point that doesn’t exist yet.

The identity crisis underneath the economic crisis. This is the thread that runs through all four stories and connects back to everything we have written in this series. The question is not only "how do I pay rent?" It is the question we closed with in the first post: what am I if this can be replaced? That is not a question career counselors are equipped to answer. It is barely a question most people know how to hold.


What Support Actually Looks Like

Not a webinar. Not a LinkedIn Learning subscription. Not a career counselor who suggests sales.

What workers in these situations describe needing: accurate information about what their options actually are, not cheerful reframing. Financial breathing room while they figure it out. Community with people going through the same thing, because the isolation compounds everything. And some acknowledgment from institutions, employers, and policymakers that the disruption is real, specific, and not a personal failing.

Some of this exists at the margins. There are worker advocacy organizations doing real work. Some union contracts are beginning to include AI impact clauses. A handful of companies have implemented genuine transition support. These are exceptions. The default response to AI displacement is to tell people to be more adaptable and then move on.

The UF researchers who studied mental health effects in AI-displaced workers found what you would expect: elevated rates of anxiety and depression, disrupted sense of purpose, difficulty with long-term planning. They also found something that gets less attention: social comparison effects, workers measuring their now-diminished economic standing against peers who had been spared. We called that Type 9 in our taxonomy. It is common, it is painful, and it is almost never named.


What Comes Next

The fifth post in this series is going to be different. After the research, the taxonomy, the history, and the human cost, we are going to do something that is probably overdue: acknowledge that sometimes you need to laugh at this, or at least sideways at it. The memes, the coping mechanisms, the dark humor that has already built up around AI anxiety. Gallows humor is not denial. It is sometimes the only rational response to an irrational situation.

But before we get there: if any of these stories landed somewhere specific for you, if you recognized something in them, that recognition is data. It tells you something about which of the nine types is yours, and it tells you that you are not navigating this alone.

The people in these stories are not hypotheticals. They are not statistics. They are, in various forms, everywhere. And the question the neo-Luddites asked, and the question these stories force into plain view, is the same one: who pays the cost? The answer, so far, is not the people who made the decision.