Resistance to AI — A Proportionate Response

For the past couple of years, I’ve been noticing something in my own work that I want to name plainly. The deeper I go into the question of human purpose in the AI era, the more I find AI itself has become my co-creator in asking it. Not a tool I pick up. A thinking partner I work with. Sounding board, sparring partner, second mind. The questions I’m asking about purpose are being shaped, sharpened, sometimes outright reframed in conversation with AI. I am, by any honest accounting, becoming more deeply engaged with AI the further I go into the most human question I know.

This would have surprised me a few years ago. It doesn’t anymore. And it’s pushed me toward a thought I want to put down clearly.

The phrase proportionate response is in the air right now, and not in an abstract way. War is being waged. Political violence is no longer a fringe concern. Ordinary people, watching the news, are thinking — sometimes for the first time — about what it means for a response to be measured, calibrated, proportional to what provoked it. The phrase has weight again.

That weight is worth borrowing.

Because there is another conversation, running alongside the harder ones, where the same concept applies and is almost never invoked correctly. The cultural posture toward AI has hardened, in many quarters, into something defensive — and increasingly into something more than posture. An incendiary device thrown at the gate of an AI executive’s home.¹ State legislatures across the country racing to pause or ban the infrastructure AI requires.² Federal lawmakers proposing nationwide moratoria.³ This is no longer mood. It is action. And it is being taken in the name of a measured, proportionate concern about what AI is doing to society.

The concerns driving the resistance are not imaginary. Job displacement and the economic dislocation that follows it. Anxiety about what happens to human meaning when machines can do what once required us. Water consumption at hyperscale data centers. Rising energy costs absorbed by ordinary ratepayers. The strain on local power grids in communities that did not choose to host the infrastructure of an industry they barely understand. These are real things, and pretending otherwise would be its own form of dismissal.

The trouble is that the underlying logic of the response is not actually proportionate. It only looks that way.

Consider where our cultural intuition about proportionality comes from. It rests, knowingly or not, on Newton: for every action, an equal and opposite reaction. That is where the "equal and proportionate" intuition quietly lives. AI displaces jobs, therefore push back against AI. The math feels right because the model feels right. Action, reaction, equilibrium restored.

But Newtonian cause-and-effect is not the right model for what is actually happening. Aristotle understood what contemporary thinkers like John Vervaeke have been recovering: that motion is shaped not only by causes but by constraints, the geometry of what is possible at all. A force without an account of constraints tells you almost nothing. Scarcity, in this older and more accurate frame, is not a feature of the material world. It is a constraint pattern. It shapes the field of human possibility by what it forecloses, not by what it actively does.

AI is not a new force pushing against humans. It is, structurally, the removal of constraints that have historically suppressed human flourishing for the vast majority of people. Medicine becoming radically more accessible. Hunger becoming structurally solvable. Education that meets each child where they actually are. Diagnostics in places that have never had doctors. Tools for human creation handed, at near-zero cost, to people who would never otherwise have had them. These are not promises. They are unfolding, happening, now, in real time.

You do not respond proportionately to a wall coming down by pushing back against the empty space.

This is where the anti-AI posture fails its own test. Proportionality, in any honest ethical sense, means weighing the full stakes on both sides of an action — not just the visible losses, but what is foregone by inaction. What is the opportunity cost? When the foregone side of the ledger contains civilizational possibilities of that magnitude, a defensive response is not proportionate. It is disproportionate in the opposite direction. The reflex to slow, restrict, retreat — applied honestly to what is actually being weighed — fails its own ethical premise.

The proportionate response to AI is to engage it. Carefully, responsibly, and fully.

This is not techno-optimism. Techno-optimism waves away the hard parts. The displacement is real. The disorientation is real. The civilizational adjustment is real and uncomfortable and will not be short. There is a reason eyes adjusting to brighter light experience the brightness as pain before they experience it as illumination. None of that is to be dismissed.

But the question is what to do with the discomfort. And here the proportionate-response frame, properly understood, has more to offer than the resistance admits. Because once you accept that the stakes on the other side are civilizational, the question changes shape. It is no longer "should we engage?" It becomes "how do we engage well enough to deserve what AI is making possible?"

That second question is the one I find myself living inside. It is why my own work has moved closer to AI rather than further from it. The answer to "how do we engage well" turns out to be inseparable from purpose. A person without a clear sense of what they are for, handed god-like creative tools, is not empowered. They are overwhelmed. The tools amplify whatever is already there. If purpose is there, AI amplifies purpose. If confusion is there, AI amplifies confusion. Scale always finds what is underneath it.

This is why I think the grassroots level is where this gets decided. Not in policy chambers. Not in corporate boardrooms. In the lived discovery of personal purpose, one person at a time, one community at a time. Institutions calibrated to scarcity will not distribute abundance for us. They are not built to. The work is human, local, and specific — and the AI capacity that makes the work possible is already in everyone’s hands.

The proportionate response to a moment of civilizational possibility is not retreat. It is the patient, deliberate work of becoming the kind of people who can carry it.

That is what I am trying to do in my own corner of it. The fact that I am doing that work with AI — humanity in the loop, not merely human in the loop — is no longer a contradiction to me. It is the point.


Notes

¹ On April 11, 2026, a 20-year-old man threw an incendiary device at the gate of OpenAI CEO Sam Altman’s home in San Francisco, then went to OpenAI’s Mission Bay headquarters and attempted to force entry. He was arrested and booked on suspicion of attempted murder, criminal threats, and possession of a destructive device. The previous November, OpenAI employees had been told to shelter in place after a separate threat at the company’s offices. See Washington Post, “Attack on Sam Altman’s San Francisco home prompts fears of AI division” (April 14, 2026); Fortune, “Attacks on Sam Altman’s home are extreme. But the AI backlash is going mainstream” (April 16, 2026).

² At least eleven U.S. states have introduced legislation since late 2025 to restrict, pause, or ban new data center construction. Maine is poised to become the first state to enact a statewide moratorium, its legislature having passed a bill in April 2026 halting new projects requiring 20 megawatts or more of power until November 2027. Similar moratorium bills are active in Georgia, Maryland, Minnesota, New York, South Dakota, Vermont, Virginia, Wisconsin, and others. Between April and June 2025 alone, twenty proposed data center projects worth a combined ninety-eight billion dollars were blocked or delayed by local resistance. See Axios, “These states don’t want data centers in their backyards” (April 5, 2026); Built In, “States Push Data Center Moratoriums as AI Growth Surges” (April 2026); Good Jobs First, “Data Center Moratorium Bills Are Spreading in 2026.”

³ At the federal level, Senator Bernie Sanders and Representative Alexandria Ocasio-Cortez introduced the Artificial Intelligence Data Center Moratorium Act of 2026 (S. 4214) on March 25, 2026, calling for an immediate nationwide pause on new AI data center construction until comprehensive federal safeguards are enacted. See Office of Senator Bernie Sanders, press release of March 25, 2026.

Humanity in the Loop

A close friend — a nurse practitioner whose life’s work sits at the intersection of oncology and spirituality in healthcare — shared something with me recently that I haven’t been able to put down.

She was describing the moments she witnesses at the bedside. The ones that don’t make it into charts or protocols. The moments where something shifts — not because of a treatment or a technology, but because one human being chose to truly meet another. She called it loving-kindness. Not as a sentiment. As a practice. A daily, renewable choice.

Every day we encounter people who challenge us. We feel irritated, misunderstood, too busy. And in that moment we have a choice: reframe toward connection, or turn away and remain fixed in our stance. Loving-kindness, she said, is rarely spoken. It lives in the tone of voice. The softness in the eyes. The willingness to stay present.

I keep thinking: this is where everything begins.


There is a word that gets used constantly in the conversation about AI and the future: abundance. It is usually followed by statistics. Productivity gains. Cost curves. Vertical farms producing 360 times the yield per square foot on 95% less water. These numbers are real and they matter.

But abundance is not only an economic event. It is a civilizational one. And civilizations are not built from the top down. They are built from the quality of the encounters between people — one moment of choosing connection over withdrawal, multiplied across billions of lives.

We are living through a transition unlike anything in recorded history. Some of us were born into a world of genuine scarcity — where resources ran out, where opportunity was rationed, where survival competed with flourishing. And some are being born right now into a world that will never know that logic. They will inherit tools and possibilities their grandparents could not have imagined.

We are the bridge generation. We stand in the middle — carrying the memory of scarcity in our bodies while moving into a world that operates by a different set of rules. That position is not an accident. It is a responsibility.


There is an important distinction being missed in almost every conversation about AI.

Some resources are extractive. Oil. Coal. Minerals. Every unit consumed is a unit gone. The world that built our institutions, our economic systems, our psychological defaults — that world was built on extractive logic. Scarcity was not a mistake. It was physics.

But there is another category of resource entirely. The sun does not deplete when it gives. A forest, wisely managed, renews itself indefinitely. Knowledge compounds. Capability builds on capability. Exponential technologies belong to this second category — they are generative, not extractive. The model isn’t a pie being divided. It’s a pie that expands as more people reach for it.

The problem is that we are trying to receive a generative abundance with an extractive mindset. We are asking scarcity questions in an abundance world. And no amount of data will fix that. Because data is not wisdom.

This is what Socrates understood and what we keep forgetting: science gives us truth. Accurate, verifiable, extraordinary truth. But truth without relevance is inert. It does not move us. The transformation — the actual shift from scarcity thinking to abundance living — is not an information problem. It is a wisdom problem. It is a human problem. It requires the work that no system can do for us.


In the ancient philosophical tradition, the psyche was understood not merely as an inner life but as a mover. Self-moving, and capable of moving things beyond itself. The foundation stone beneath the Temple Mount in Jerusalem is described in similar terms — the thing at the center of the world, hidden and weighty, around which everything orients.

Purpose works this way. When a person discovers what they are genuinely here to do, they do not simply feel better. They become a different kind of force in the world. They begin to move things.

This is what is at stake in the AI era — not whether the technology works, but whether the humans wielding it are themselves purposeful. Whether we are moving things, or being moved.


There is a phrase used in technology circles: human in the loop. It describes a system design where a person remains involved in decisions — a safeguard, a checkpoint, a corrective.

I want to propose something different. Not human in the loop. Humanity in the loop.

Because the risk is not only that AI makes bad decisions. The risk is that we automate away the very qualities that make decisions worth making — the loving-kindness, the wisdom, the capacity to meet another person in their full reality and choose connection over withdrawal.

We cannot stop what is coming. And we should not want to. But we are not passengers. We are links in a chain — one of the most critical links in the entire history of civilization.

There are perhaps a few hundred people in the world making the foundational decisions about how AI is built. Their thinking, their wisdom, their sense of responsibility matters enormously. But the direction AI actually takes — the values it serves, the lives it shapes, the civilization it helps build or erode — will not be determined by them alone. It will be determined by all of us.

Not primarily through legislation or protest. Through application. Through the nurse who uses AI to spend less time on paperwork and more time at the bedside. Through the small business owner who leverages it to reach a market they could never have accessed alone. Through the teacher, the farmer, the caregiver, the entrepreneur in a city that has never before had access to these tools — each one deciding, consciously and purposefully, what they are going to build with what abundance is making available.

That is the point of humanity in the loop. Not a political movement. A human one. Every person who brings their purpose into contact with these tools multiplies both. Every community that decides what flourishing means for them, and uses every available resource to get there, is doing exactly what this moment in history is asking of us.

That work begins in small places. At a bedside. In a difficult conversation. In the choice, made again this morning, to meet the world with open eyes and an open heart.

The pie is expanding. The question is whether we are becoming the kinds of people who can truly receive it — and give something worth multiplying back.

The Conductor’s Podium Is Empty. And Waiting!

I sat down for coffee this morning next to someone I hadn’t seen in a while.

She’s experienced. Capable. She spent years building things in the startup world — managing people, navigating complexity, holding organizations together when everything was moving fast and resources were thin. Real work. Hard-earned expertise.

She’s looking for a job now.

And the conversation left me sitting with something I couldn’t shake for the rest of the day.

The roles she’s qualified for are shrinking. Not because she lacks ability — but because AI is quietly absorbing the entry points, the mid-level positions, the rungs of the ladder people spend careers climbing. It isn’t just where you start that’s changing. It’s whether the ladder itself still exists in the form we’ve always known.

She doesn’t need a better CV.

She needs to understand that the world she’s trying to enter has already been replaced by a larger, more open one.

And she is not alone. What she is experiencing is not a personal career setback. It is a signal — arriving at café tables, in rejection emails, and in awkward performance reviews all over the world — that something structural has shifted. The woman across from me this morning is standing at the edge of a transformation that is rewriting the rules for everyone, at every stage, in every field.


Here’s the frame I keep returning to.

The old equation governed everyone — not just the entrepreneur dreaming of a startup, but the professional climbing a corporate ladder, the freelancer chasing clients, the mid-career expert waiting to be recognized, the employee who traded autonomy for the security of a steady role. Every position in the value chain was a different strategy for managing the same underlying condition: resources were limited, access was controlled, and you organized your working life around getting close enough to both.

Most visions — entrepreneurial or otherwise — died somewhere in that structure. Not for lack of passion or capability. For lack of position. For lack of runway. For lack of permission from someone further up the chain who held what you needed.

We celebrated the people who navigated it successfully as a special category of human: the entrepreneur. Risk-taker. Visionary. The one who could absorb what others couldn’t.

But I think we misread them.

They weren’t drawn to the risk. They were willing to absorb it in service of something they believed in. The risk was never the feature. It was the tax. And the rest of us — the employed, the climbing, the hustling, the waiting — were paying a different version of the same tax. We called it compromise. We called it patience. We called it being realistic.

AI is eliminating the tax. For everyone.

Execution costs are collapsing. Access to tools, talent, and infrastructure that once required significant capital or institutional backing is approaching zero. That old consolation of optimists — the impossible just takes a little longer — is quietly becoming a literal statement of fact. The impossible is now within reach, and arriving faster than anyone has fully adjusted to.

It has been said: “You have a purpose and you’re motivated — you can go out and do anything you want now.”

Read that not as inspiration. Read it as a description of a new reality. One that applies not just to founders and visionaries — but to everyone who ever had something they wanted to build, contribute, or express, and found the old structure standing in the way.


Which means the conductor’s podium has been empty for a while now.

Most people just haven’t noticed it yet.

The person who thrives in the AI era looks nothing like the archetypes the old structure produced. No longer a risk-absorber. No longer a scarcity manager. No longer someone who survives the gauntlet through sheer force of will, the right connections, or proximity to capital. No longer someone waiting to be chosen.

More like a conductor.

Someone who knows what the music should sound like, and has the judgment to direct the ensemble toward it. The ensemble — the agents, the tools, the resources that are now abundant and largely free — takes direction. What it cannot do is supply the intention.

That’s the human job now.

And intention is another word for purpose.


So back to the woman at the café.

She isn’t losing a seat at a shrinking table. She’s standing in front of a conductor’s podium — open, waiting, hers — and wondering why nobody has offered her a chair.

The chair was never the point.

What she needs isn’t a job description written by someone else. What she needs is the internal permission to step up and say: I know what music I want to make. Her skills aren’t obsolete. The role she’s reaching for — fitting herself into someone else’s organizational chart, waiting to be chosen, fulfilling someone else’s vision of what her contribution should look like — is a solution to a problem that is rapidly ceasing to exist.

Most of us have spent our careers doing exactly that — and calling it a career. The structure rewarded it. The structure was built around it. Show up, perform within the defined boundaries of someone else’s vision, secure your place in the chain.

The podium asks only one thing: that you know what you’re here to direct.

This is the conversation we are not having loudly enough.

AI isn’t just changing what’s possible. It’s changing who gets to do the possible. But only for people who know what they want to do with the possible. That’s the gap. Not technology. Not access. Not even opportunity.

The gap is self-knowledge.

And closing that gap — finally, urgently — is the most practical work any of us can do right now.