TL;DR: People panic about AI making us obsolete, then rush to say "your life still matters!" without explaining why. That gap is the problem. We've confused productivity with purpose for so long that we can't articulate human value beyond being useful. Turns out Buddhism, Frankl, and sci-fi already figured this out: meaning comes from conscious engagement, not outcomes. We just forgot because capitalism taught us otherwise.
So there's this thing that keeps happening in the AI conversation.
Someone brings up how AI is getting better at coding, writing, analysis, creative work. Someone else says "but wait, do our lives still matter if machines can do everything?" And the response is always this defensive, immediate "YES! Of course your life matters!"
Then everyone moves on.
But nobody actually explains why. And that gap — that's the whole problem.
What Banks Actually Showed Us
People reference Iain M. Banks' Culture series like it's about cool spaceships. It's not. Banks was a democratic socialist from Scotland who spent 25 years building a fictional civilization to work through one question:
What gives life meaning when all your material problems are solved?
In the Culture, the Minds (sentient AIs) are smarter than humans in every way. They could run everything alone. They choose not to — not because they need human labor, but because diverse conscious perspectives make civilization richer.
Culture citizens modify their bodies at will. Change sex, add drug glands, live indefinitely. No money, no scarcity, no want. The Minds handle complex decisions and are genuinely ethical — probably more ethical than humans.
And people still struggle with meaning. Still get bored. Some leave the Culture entirely for less advanced civilizations where things are harder, where effort matters more.
Because without constraint, without challenge, without stakes… experience flattens out.
The characters' problems aren't survival or achievement. They're figuring out what makes experience worthwhile when nothing's difficult. Some find it through games, some through relationships, some by choosing dangerous problems they don't need to solve.
But the key thing: the value is in the engagement, not the outcome.
This isn't about technology. It's post-scarcity philosophy. The tech solved survival. Philosophy has to solve purpose.
Where Frankl Gets Interesting (and Limiting)
Viktor Frankl built his entire philosophy around finding meaning through constraint. He survived the Nazi concentration camps, Auschwitz among them, and came away convinced that even in the worst conditions, humans can choose their response.
Man's Search for Meaning is based on this: you find purpose by how you engage with suffering, limitation, challenge.
Okay but.
What happens when limitations disappear? When automation handles work? When material scarcity ends? Frankl's framework was brilliant for the 20th century. Not sure it works for the question we're actually facing.
Buddhism Saw This Coming
Buddhist philosophy has dukkha, usually translated as "suffering," though "unsatisfactoriness" is closer: even when things are good, there's an underlying dissatisfaction.
Three types. Ordinary physical and mental pain, sure. The distress of impermanence, of good things ending. And the existential suffering of being a conditioned, temporary pattern of awareness in a vast universe.
You know what doesn't fix that third one? Better technology. More stuff. Even perfect material abundance.
Buddhism predicted the post-scarcity meaning problem 2,500 years ago. The answer wasn't "accumulate more" or "achieve more." It was to change your relationship to desire itself. Stop trying to permanently satisfy the unsatisfiable. Start finding purpose in conscious engagement rather than outcomes.
Iain M. Banks read Buddhist philosophy extensively. Shows up all over the Culture books if you're looking for it.
The Trap Everyone's Falling Into
The whole "when computers beat us at everything" framing is rigged from the start. It assumes human value comes from being better at tasks than other things are.
That's not philosophy. That's capitalism pretending to be philosophy.
Ask someone why their life matters. They'll tell you about their job, skills, accomplishments. What they produce. What they're good at. How they're useful.
Now imagine they lose all that tomorrow. Does their life stop mattering?
Most people would say no. But they can't articulate why. Because we've spent 200 years training humans to derive meaning from productivity, and now we're discovering that was always a lie. A useful lie for an industrial economy, but still a lie.
What Actually Threatens Meaning
It's not AI capability. It's that our philosophical frameworks are embarrassingly thin.
John Vervaeke talks about how people respond to the AI meaning crisis. Three failure modes:
Nostalgia — yearning for the past, resenting change, retreating into fundamentalism.
Escapism — just not thinking about it, spiritual bypassing.
Cargo cult worship — treating AI as magical, expecting it to solve everything.
All three are happening right now. You can watch it in real time on Twitter.
None work because they're trying to solve a philosophical problem with emotional reactions. Like trying to fix a broken engine by getting angry at it.
Where This Gets Personal
I've watched people have existential crises when their jobs got automated. Not because they needed the money (some did, but not all). Because their identity was wrapped up in being good at something, and now a computer does it better.
That crisis is coming for more people. Faster than most realize.
The response can't be "don't worry, you're still valuable!" without explaining why. That's a band-aid on a philosophical wound.
The real response has to be: you were never valuable because you were good at spreadsheets.
You were valuable because you're a conscious being who experiences things. Who makes choices. Who creates meaning from the raw material of existence. The spreadsheet skill was always incidental. We just built an entire economic system that pretended otherwise.
What They're All Actually Saying
Buddhism, Frankl, and Banks are saying compatible things:
Buddhism: meaning doesn't come from outcomes. It comes from how you engage with experience.
Frankl: meaning comes from choosing your response to circumstances, not from the circumstances themselves.
Banks: in the Culture, meaning comes from conscious participation in civilization, not from productive necessity.
None of them require you to be better than AI at anything.
All of them require you to shift what you think meaning is.
The Uncomfortable Part
Most people want cosmic significance. They want proof their life matters objectively, not just because they decided it does.
They're not going to get that.
Nobody gets that. We never did. We just had better distractions — survival, achievement, productivity, status. Things that felt like they mattered objectively because they had clear metrics.
AI strips away those distractions by handling the measurable stuff. What's left is rawer. Harder to articulate. But maybe more honest.
You matter because you're here. Experiencing. Choosing what to care about even though nothing requires you to care about anything. Making meaning even though the universe doesn't hand it to you pre-packaged.
That's it.
So Why Does Your Life Matter?
Not because you're useful. Not because you're better than AI at something. Not because consciousness is magically valuable in some cosmic ledger.
Your life matters because conscious participation is the point, not a means to something else.
When you play Elden Ring, the point isn't beating the game efficiently (you could watch a speedrun). The point is the experience of engaging with challenge, making choices, developing skill, overcoming limitation. The AI could beat every boss in seconds. So what? That's not what playing is about.
Life's like that.
Iain M. Banks knew this. The Culture Minds could simulate any human life perfectly, optimize every choice. They don't because the value is in the living, not the outcome.
Frankl knew this. The meaning wasn't in surviving the camps. It was in how he chose to respond.
Buddhism knew this. The point isn't reaching enlightenment so you can stop existing. It's transforming your relationship to existence itself.
The Answer Nobody Wants
Do our lives still matter when computers beat us at everything?
Yes.
But not because we're still needed. Because "being needed" was always the wrong framework.
They matter because we're conscious beings who get to experience things and choose how we respond and create meaning from that. Not because the universe requires it. Not because we're special. But because that's what being alive is.
That's the answer.
It's not comforting. We want to matter.
We get to matter by choosing to engage. That's it. That's all there ever was.
We just had better excuses before AI came along and forced the question.