How I stopped worrying and learned to love the future
From someone who genuinely loves technology.
I have loved automation ever since I learned how to use it well. I believe it can create a better life: less friction, fewer wasted hours, more access, more leverage for everyday people. But I want to take a step back and ask a harder question:
What future are we actually building and what are we accidentally giving away in the process?
Because the biggest risk with AI isn’t that it becomes “evil.” Sorry, Terminator fans.
The bigger risk is something Strangelove captured perfectly in satire:
powerful systems become dangerous when humans stop taking responsibility for them.
Not because anyone wants disaster but because incentives, speed, and procedure quietly replace judgment.
That’s the lesson I’m carrying into how I think about what I build. And it’s why I have so much faith in Gray Collar.
The Strangelove lesson isn’t “technology is bad.” It’s “abdication is.”
In Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, the world doesn’t collapse because people set out to destroy it.
It collapses because a system is built to run fast, rigid, and “correct” according to its own logic, and when something goes wrong, there’s no real human override that can stop the chain.
That’s the warning for AI:
When speed becomes the highest value
When automation becomes the default answer
When responsibility gets diffused (“the system decided”)
When reversal is hard, and oversight is optional
…you don’t get progress. You get momentum. And unfettered momentum doesn’t care about humans.
One of my favorite expressions: you can’t un-wrinkle a piece of paper.
We live in the world with AI now. We need a human roadmap, not just better tools. AI is here, and adoption is truly breathtaking.
Because it’s so fast, we’re seeing a behavior shift: people trading independence and oversight for speed. Not everyone by a long shot, but more than I would have expected.
I’ve seen:
Plugging unknown systems into critical workflows
Trusting outputs without understanding inputs
Connecting tools with unclear capabilities and unclear vulnerabilities
Removing review because “it’s faster”
That’s not innovation. That’s abdication.
And it’s exactly how “no one meant for this to happen” becomes the most dangerous sentence in the room.
Gray Collar is the antidote to a fully automated society
Here’s the part that doesn’t get said enough:
If a society automates away meaningful work, it doesn’t just lose jobs; it loses structure.
Work is not only income. For millions of people, it’s:
dignity
passion
identity
community
purpose
responsibility
routine
upward mobility
A world where everything is automated doesn’t become a utopia.
It becomes a hollow, dark place.
And worse, it becomes even more divided.
Because when automation replaces the middle of the labor market, it doesn’t replace it evenly. It creates a cascade:
The most “repeatable” work gets automated first (this has been happening)
Then the roles that support those jobs get automated (we are seeing this now)
Then the adjacent “middle skill” roles
Then the gap widens until only two categories remain:
highly custom, human-first work (mostly affordable to the top)
fully automated, lowest-cost everything (for everyone else)
That’s how you get not only a wealth divide but a massive labor divide: a sharply contrasted K-shape.
A society where human effort becomes a luxury product is a major step backward toward a new kind of aristocracy.
Technology fills gaps, so as a society we have to choose which gaps humans should fill first
Technology has a funny way of moving toward whatever society needs the most.
It fills gaps:
If labor is missing, technology fills it
If cost is too high, technology compresses it
If demand isn’t met, technology scales it
If supply has friction, technology removes it
That’s not a conspiracy, that’s incentives.
So the question becomes:
Do we want technology to fill the gap of “not enough humans,” or do we want to build a system that strengthens humans to meet the needs of society?
That’s what I see for Gray Collar. Not nostalgia. Not rejecting AI.
A workforce strategy designed for the AI era, one that bolsters a future with humans still in charge.
Gray Collar can help keep the world human while using AI to make it better
We should use AI to remove the admin drag so humans can do the “human work”.
That means building careers (and a culture) that values roles like:
long-term care and caregiving
skilled trades and infrastructure
community operations and support
health, education, and service roles that require trust
And pairing them with technology in a way that upgrades the work instead of replacing it:
AI for scheduling, documentation, routing, safety, and compliance
AI for training and skill-building
AI for matching people to roles where they thrive
AI for reducing burnout and improving success
The goal isn’t to stop the future. It’s to keep it human.
This is not an anti-technology rant. This is not a political debate and shouldn’t be received as such.
It’s a pro-human argument.
It’s a recognition that a functioning society isn’t just built on efficiency, it’s built on active participation. On people having a place, a role, and a contribution that matters.
That’s why I’m leaning as hard as I can into Gray Collar.
Because if we don’t intentionally strengthen the workforce that keeps communities running, especially in care, infrastructure, and essential services, technology will fill those gaps by default. And we may not like the results.
The more it fills, the less room there is for human dignity, community, and identity.
Strangelove was satire, but the warning is real:
The danger isn’t the machine. The danger is humans stepping away from responsibility.
So yes! Let’s build with AI.
But let’s build a world where people still have meaningful work to do, responsibility to carry, and communities to strengthen.
That’s the future worth automating toward.

Of the many insights Brendan offers in this article, this quote resonates most with my 'why.'
"That means building careers (and a culture) that values roles like:
- long-term care and caregiving
- skilled trades and infrastructure
- community operations and support
- health, education, and service roles that require trust"
We have come to respect, *über alles*, the roles that make the most money, but MANY of those roles don’t add value to society; they extract it. Having been a part of several of those careers, I know of what I speak.
I will be very happy if the culture shifts even a little toward valuing roles that add value, dignity, and love to our world.