AI Giving Us Free Time Is the Most Psychologically Violent Thing That Could Happen to Us
I used to think the dream was to stop working. I used to lie awake fantasizing about a life where I simply existed, everything got handled, and I was free to just be alive. Then I took a three-month sabbatical, genuinely free time with money covered and obligations lifted, and by week five I was bargaining with myself to invent problems to solve because the silence inside my own head was genuinely unbearable. That experience taught me something I have been sitting with ever since. The dream of freedom and the reality of freedom are two completely different psychological events.
What AI is actually threatening to do is expose the single greatest lie modern civilization has been running on. That lie is that humans want leisure. We say we want leisure. We talk about beaches and retirements and long holidays. The truth is that leisure only feels good as a contrast to effort, the way sleep only feels good because you were awake. Strip away the effort permanently and leisure stops being a reward. It becomes the only texture of existence, and the human nervous system was sculpted over millennia to find that texture deeply, quietly horrifying.
Here is the perspective almost every think piece misses completely. The people who will suffer most in an AI abundance future are the highly educated, high-achieving, deeply career-identified people. The ones everyone assumes will adapt gracefully because they are smart and resourceful. Those people have the most elaborate identity structures built entirely around professional contribution, and those structures will collapse the hardest. The carpenter who worked with their hands might find genuine joy in growing food or building things for pleasure. The consultant who spent thirty years optimizing supply chains has no such translation available.
I keep thinking about what boredom actually is at a neurological level, because I think people misuse that word. Boredom is the brain signaling that it is capable of more than it is currently being asked to do. It is not laziness. It is the mind demanding a worthy challenge. For every human being who has ever lived, the structure of economic survival has supplied that challenge automatically and universally. AI removing that structure does not remove the underlying neurological demand. It just removes the thing that was satisfying it by default, and leaves billions of people with a hungry, restless brain and absolutely nothing culturally established to feed it.
The spiritual traditions saw this coming centuries before Silicon Valley did. Every serious contemplative practice in human history has essentially been training for how to be present without external justification.
Meditation, monasticism, philosophy as a way of life. These were technologies developed to answer the question of how a human being maintains dignity and aliveness when the usual economic and survival pressures are removed. The genuinely wild thing is that those traditions spent thousands of years trying to attract followers and largely struggled to grow, because almost nobody needed those skills urgently. AI is about to make them the most critical survival technology on earth, and we have spent the last hundred years actively dismantling the institutions that taught them.
There is a very specific psychological phenomenon I expect to see emerge that I have started calling productivity grief. It will present like depression but the actual mechanism will be closer to bereavement. People will be mourning a version of themselves that had an automatic answer to the question of why they mattered. That version of themselves was maybe exhausted and stressed and overworked, but it was never confused about its own relevance. Relevance is a more fundamental human need than comfort, and we have designed an economic system that delivered relevance as a side effect of participation. Removing participation removes the relevance delivery mechanism, and the grief that follows will be something psychiatry is almost entirely unprepared to treat.
The most honest thing I can say is that I think a small percentage of humans are going to absolutely flourish in an AI abundance future, and I think that percentage will look very different from who we currently assume it will be. The ones who flourish will be people who already live from intrinsic motivation, who make things because the making itself feeds them, who pursue understanding because curiosity is its own reward, who invest in relationships with the same intensity others invest in careers. These people exist right now and are usually considered mildly impractical by the productivity-obsessed mainstream. They are about to become the template for psychological survival.
The deepest question I sit with is whether meaning can be self-generated at scale, or whether meaning requires scarcity and stakes to feel real. A painting feels meaningful partly because making it cost the painter something. A conversation feels meaningful partly because both people chose to spend irreplaceable time on it. If AI removes scarcity of resources and scarcity of time simultaneously, I genuinely wonder if humans retain the psychological architecture to locate meaning at all, or if we will have to build that architecture from scratch, collectively, with no historical map, which is either the most exciting or the most frightening project our species has ever undertaken.