The Last Human Skill
I started thinking about this when I was considering whether to pursue AWS certifications. The honest question I was sitting with was simple: if AI is already handling the majority of software development work, if it can retain more context than any human developer, if it can search, synthesize, and execute faster than I ever could, then what exactly am I training for? And the answer I arrived at surprised me. The certifications still matter, but not for the reason most people think. They matter because they teach you judgment. They teach you architecture, system design, when to use which tool and why. That's the layer AI still needs humans for. You're not competing with AI on execution anymore. You're competing on decision-making, and that's a game worth playing.
But that realization opened a much bigger question for me. I've always been the kind of person who likes to learn broadly, across multiple niches and genres, building a portfolio that spans psychology, cloud infrastructure, design, philosophy, and whatever else genuinely interests me. And I started wondering whether that kind of broad, fluid learning actually protects you in an AI-dominated world, or whether it leaves you perpetually behind, always chasing skills that are being automated faster than you can acquire them. It felt like playing Whac-A-Mole, hitting targets before they disappear back underground, except the game keeps getting faster and there are more holes than you have hands.
Here's what I concluded though. The people who will thrive aren't the ones who chase individual tools. They're the ones who understand the underlying domains those tools serve. When AWS gets automated, you don't need to learn AWS anymore. You need to understand what problems cloud infrastructure solves and when different architectural approaches make sense. That domain knowledge is portable in a way that tool knowledge never is. So the broad portfolio isn't a liability. It's actually the entire point. It makes you someone who can synthesize ideas across fields, someone who asks better questions, someone who sees connections that specialists miss.
Then I pushed the question further. What happens when we move past AI as we know it today, past execution-level intelligence, into Artificial General Intelligence and eventually Artificial Superintelligence? What happens when machines don't just execute tasks but actually think, create, strategize, and feel their way through problems the way humans do? At that point, the question stops being about which skills to develop and becomes something much more existential. It becomes about what humans choose to do with their existence when survival is no longer the primary organizing force of daily life.
This is where I think we have a serious psychological problem that nobody is talking about honestly enough. Our entire current understanding of productivity is built on the idea that you have to be constantly doing something measurable and valuable. You have to be producing. You have to be contributing. And if you're not, something is wrong with you. That belief is so deeply embedded in how we're raised and educated that free time, genuine unstructured free time, actually triggers anxiety and depression in a lot of people rather than creativity and joy. AI giving us free time sounds like a gift until you realize that we haven't built the psychological infrastructure to receive it. We don't know what to do with ourselves when we're not being productive in the traditional sense.
This is why I think the redefinition of productivity is the most important cultural shift that needs to happen alongside technological development. Productivity in a post-AGI world can't mean output per hour. It has to mean something closer to depth of experience, quality of connection, richness of learning, and contribution to meaning. And the education system, as it currently exists, is almost completely unequipped to prepare people for that shift. The major-minor university structure assumes you're specializing for a job market. It creates silos when the future rewards bridges. It optimizes for credentials when the future will reward demonstrated adaptability and creative synthesis. I think universities will either adapt into something much more interdisciplinary and self-directed, or they'll slowly become irrelevant as portfolio-based, curiosity-driven learning becomes the dominant model.
If I were designing an education system for the world that's coming, I'd build it around a completely different set of foundations. The first would be teaching people how to think and how to ask good questions rather than how to memorize and repeat answers. The second would be meta-skills, which are the portable thinking patterns that transfer across domains: systems thinking, pattern recognition, learning how to learn, knowing how to collaborate intelligently with AI rather than blindly following it or irrationally avoiding it. The third would be philosophy, ethics, psychology, and the humanities, because these become more important, not less, when survival isn't the daily concern. When you have time and freedom, the question of how to live well becomes the central question of human existence. And the fourth would be emotional intelligence and the capacity for genuine human connection, because those are things that machines can simulate but never authentically provide.
Speaking of which, there's a category of skills I think about a lot that sits completely outside the AI conversation in a meaningful way. Playing piano. Playing violin. Cooking a meal from scratch for someone you love. Painting. Writing poetry. These skills aren't valuable primarily because of their output. They're valuable because of the process and the humanity embedded in them. When you hear someone play violin with genuine emotion, the slight imperfections, the breath behind the notes, the feeling that a living person is communicating something to you, that's not a bug. That's the entire point. And as AI gets better at producing technically perfect outputs, human imperfection and authenticity will paradoxically become more valuable, not less. A handwritten letter becomes more precious when everyone uses AI to write. A live performance becomes more meaningful when AI can generate studio-perfect music instantly. These aren't just hobbies. In the world that's coming, they might be the primary currency of human value.
This brings me to an economic observation that I think is genuinely underappreciated. Right now, running an AI company is extraordinarily expensive. The compute costs, the energy infrastructure, the servers, the constant iteration, it burns money at a rate that most businesses can't comprehend. And people talk about this as a temporary problem, something that will resolve as technology matures and costs come down. But here's the flip side of that trajectory that nobody seems to be discussing. As AI becomes cheaper and more ubiquitous, human labor becomes scarce and premium. In a fully automated world, running a company staffed entirely by humans becomes the expensive, rare, extraordinary thing. It becomes the luxury product.
You can already see this pattern emerging in small ways. Handmade furniture costs ten times more than factory-produced furniture. Farm-to-table restaurants charge significant premiums because humans grew, harvested, and prepared everything. Bespoke tailoring, human therapists, live musical performances, locally crafted goods — all of these carry premiums not because their outputs are necessarily more functional but because the human effort and presence behind them is increasingly rare. In an ASI world, a fully human company becomes the ultimate luxury brand. And the skills that make you irreplaceable in that context aren't your AWS certifications or your coding ability. They're your creativity, your emotional depth, your authentic presence, your ability to connect one human being to another in a way that feels real.
Now, will all of this mean the end of AI? I don't think so, and I think that's actually the wrong question. AI and human enterprise won't be competing — they'll be serving entirely different markets at entirely different price points. It's like how Rolex and Casio both exist, both thriving, neither destroying the other. Beyond that, once ASI is embedded deeply enough into global infrastructure — energy grids, healthcare systems, logistics, communication networks — you can't remove it without catastrophic consequences. It becomes as foundational as electricity. The more interesting question is what the relationship between those two layers looks like: ASI handling everything at the base level with ruthless efficiency, and human creativity, craft, and connection operating as the premium layer that people aspire toward and pay significantly for.
From a pure career guidance perspective, if I had to distill everything into practical direction for someone entering the workforce today, it would be this. Don't specialize so deeply in a single tool or technology that your entire value proposition depends on that tool remaining relevant. Instead, invest in understanding the domains and problems that tools serve. Build genuine cross-disciplinary knowledge, not for the sake of collecting credentials, but because the ability to synthesize ideas across fields is becoming one of the rarest and most valuable capabilities a person can have. Develop at least one deeply human skill, something physical, creative, or performative that requires your body and your presence and your authentic emotional investment. And learn how to use AI as a collaborator rather than treating it as either a threat to avoid or a crutch to depend on entirely. The people who understand both the power and the limitations of these tools will direct them far more effectively than people who either fear them or blindly follow them.
The deeper career guidance, the kind that goes beyond which certifications to pursue, is about building psychological resilience for a world where your professional identity can't be the entire source of your self-worth. That shift is coming whether we prepare for it or not. The people who will navigate it most gracefully are the ones who have cultivated meaning across multiple dimensions of their lives simultaneously — in their relationships, their creative practices, their intellectual curiosity, and their capacity to sit with uncertainty and still feel purposeful. That's not soft advice. In the world that's coming, that's the most practically valuable thing a person can develop.
I don't know exactly what the world looks like on the other side of AGI and ASI. Nobody does, and I'd be suspicious of anyone who claims otherwise. But I do know this: the question of what humans do when machines do everything is ultimately a question about what we actually value when survival is no longer the organizing principle of our days. And my honest answer, after sitting with this for a long time, is that we return to the things that were always most human. We make music. We tell stories. We build relationships. We ask questions that don't have clean answers. We find meaning in the process of learning and creating rather than in the products we produce. We become, in the fullest sense of the word, alive. And maybe that was the point all along.