Are We Becoming Machines? How AI Is Threatening the Human Spirit
By Brian Simpson
You've heard the promises: AI will make life easier, free us from work, even raise our kids. Sounds amazing, until you realise what it's really offering: a world where humans are optional. Where struggle, growth, and connection are replaced by convenience, optimisation, and passivity. This isn't progress. It is sedation and replacement. And if we don't fight back, we risk losing not just our jobs, but the essence of what makes us human.
We are told that automation will liberate us, that universal basic income will cover our needs once robots handle all the jobs. But this is not liberation. Work has always been more than a pay cheque. It is where we grapple with responsibility, overcome obstacles, and discover purpose. Strip away the struggle, and you strip away the growth. What sounds like freedom quickly becomes a life devoid of meaning. Comfort without purpose is not living. It is existing.
AI doesn't just automate tasks; it threatens the process of becoming. Human identity is forged in friction, failure, frustration, adaptation. If we outsource not only labour but judgment, creativity, and learning, we risk fossilising ourselves. We stop becoming; we merely exist as output processors, like the machines we've created.
Perhaps the most chilling trend is the creeping automation of relationships. We see ads of humanoid robots caring for children, acting as teachers, even standing in as doctors. Some call this progress. But care is not a transaction. Connection cannot be mass-produced. The moment we outsource our most intimate bonds to machines, our teaching, our caregiving, our healing, we reduce human life to a set of services on demand.
Machines can simulate kindness, predict needs, even mimic concern, but they cannot suffer, struggle, or grieve. Without the human response to suffering, society risks cultivating empathy deficits. Imagine generations growing up in a world where comfort is guaranteed but moral, emotional, and social courage is never exercised. AI care doesn't teach patience, resilience, or accountability; it merely reproduces patterns.
Through brain-machine interfaces, technologists now imagine linking our thoughts directly to AI. They dream of erasing painful memories, inserting artificial ones, or even merging human consciousness with silicon. But memory is not just data storage; it is identity crystallised through imperfection. Forgetting, misremembering, suffering through flawed recollections: these are the scaffolds of human insight.
If memories can be implanted and rewritten, individuality itself is at risk. Humans become malleable, identity becomes modular, and history becomes a tool wielded by whoever controls the algorithm. A society in which thought is programmable is not a society of free citizens. It is a society of digital slaves.
Even in medicine, regulators increasingly lean on AI simulations to replace real-world clinical trials. This is sold as efficiency, but algorithms are nothing more than reflections of their creators. If the data is biased, the results are biased. If the incentives are corrupt, the "science" becomes corrupted.
Efficiency is not the same as wisdom. Wisdom grows from moral dilemmas, uncertainty, and risk. AI can propose "optimal" solutions, but it cannot understand meaning, nor weigh value in the human sense. Over-reliance on algorithms risks cultivating a society that is technically correct but morally and existentially barren.
AI's champions offer a seductive vision: a future without stress, without effort, without labour. A future where machines do everything and humans do nothing. But underneath that dream lurks a nightmare, the nightmare of becoming irrelevant. The Pixar movie Wall-E should be read not as a blueprint but as a warning: human beings left lazy, medicated, and isolated while machines do the thinking, the striving, even the living. That is not utopia. That is anaesthesia.
The ultimate irony is that AI's gravest threat is self-domestication. The more convenient, optimised, and risk-free our lives become, the more passive we grow and the more we internalise the very logic of machines. The Wall-E nightmare isn't machines taking over; it's humans willingly becoming automated in thought, desire, and action.
The greatest danger posed by AI is not that it will rise up against us, like in the dystopian movies. The real danger is that it will make us surrender too much of ourselves willingly. We are already outsourcing our work to it, our creativity to it, our decision-making to it. Each step makes us a little more passive, a little more dependent, a little less human. Slowly, silently, we begin to resemble the very machines we built.
It doesn't have to be this way. AI could serve as a tool, a partner even, if governed by clear moral, constitutional, and cultural boundaries. But it cannot be allowed to hollow out the human spirit. If a technology undermines free will, erodes responsibility, or automates away the bonds of care and connection, then it should not be adopted. The line must be drawn here.
In practice, drawing that line means deliberate habits. Retain friction: pursue tasks that require effort, failure, and personal judgment, even if AI could do them faster. Guard relationships: invest in human connection that cannot be outsourced, in mentorship, caregiving, and conversation. Curate knowledge: treat AI as a tool, not a teacher; engage critically with ideas, and don't allow predictive algorithms to shape thought unilaterally. Embrace imperfection: celebrate memory, judgment, and creativity as human, fallible, and non-optimisable.
Because what is at stake is not simply the job market or the economy. What is at stake is reality itself. Do we want a future where children are raised by robots? Where memories can be swapped like apps on a phone? Where human beings choose sedation over struggle, and illusion over life?
The risk is not that machines will act like us. It's that we will start acting like them: optimised, efficient, obedient. Machines in human skin. Unless we pause, rethink, and restrain this path now, that is the future being built, day by day.
"America was built on the foundational belief that every man is created in the image of God with purpose, responsibility, and the liberty to chart his own course. We were not made to be managed. We were not made to be obsolete. But that is exactly the future Big Tech is building under the banner of Artificial Intelligence (AI). And if we do not slam the brakes right now, we are going to find ourselves in a world where the human experience is not enhanced by technology but erased by it.
Even Elon Musk, who is arguably one of AI's most influential innovators, has warned us about the path we are on. In a sit-down with Israeli Prime Minister Benjamin Netanyahu, he laid out the endgame. AI will lead us to either a future like the Terminator or what he described as Heaven on Earth. But here is the kicker. That so-called heaven looks a lot like Pixar's Wall-E, where human beings become obese, lazy blobs who float around while robots do all the work, all the thinking, and frankly all the living. This may seem like science fiction, but this is what they are actually building.
At last year's We, Robot event, Musk unveiled Tesla's new self-driving robotaxi. But what caught my attention was their preview of Optimus, the AI-powered humanoid robot. In their promotional video, Tesla showed Optimus babysitting children, teaching in schools, and even serving as a doctor. Combine that with Tesla's fully automated Hollywood diner concept, where Optimus is flipping burgers and even working as a waiter and bartender, and you begin to see the real aim. Automation is replacing human connection, service, and care.
So where do humans fit in? That is the terrifying part. Musk and Bill Gates have both pitched universal basic income as a substitute for the traditional employment AI is expected to eliminate. Musk has said there will come a point where no job is needed. You can have a job if you want one for personal satisfaction, but AI will do everything. Gates has proposed taxing robot labor to fund people who no longer work.
The reality is that work is more than a paycheck. It is not just how we survive; it is how we find purpose. It is how we grow, how we learn, and how we take responsibility. Struggle is not a flaw in the system; it is part of what makes us human. The daily grind, the failures, the perseverance, the sense of accomplishment. Strip all of that away, and you have stripped away humanity.
The problem goes deeper. Through Neuralink, Musk wants to merge the human brain with AI. On The Joe Rogan Experience, he claimed the technology could erase memories and implant new ones. That may sound redemptive for trauma survivors, but in the wrong hands, it is pure dystopia. Governments or corporations with the power to rewrite memory and reshape thought do not create freedom. They create digital slaves.
Meanwhile, the Food and Drug Administration is now authorizing AI-simulated clinical trials for drug and vaccine development. That means fewer real-world trials and more reliance on algorithms. But those models are only as good, or as biased, as the data and the programmers behind them. And let us not forget that Big Pharma's grip on federal health agencies is well documented. While RFK Jr. and his team may be holding the line now, what happens when a new administration takes over and the revolving door between pharmaceutical companies and regulators swings wide open again?
... These systems can easily reflect the beliefs and intentions of their programmers. And if those programmers work for corporations that answer to shareholders and not citizens, you have a dangerous concentration of power that could surpass even our federal government.
We are not just automating tasks; we are automating thought, decision-making, and identity. We are being sold a future where work, responsibility, and even memory are optional. Where kids are raised by bots. Where real life becomes a simulation. It may sound utopian on paper, but in practice, it is a world where nothing matters because nothing is real.
The Trump administration and every elected official who claims to care about freedom need to hit pause. The partnerships forming between AI developers and government agencies are consolidating control. Big Tech is altering the trajectory of humanity without the consent of the people. That has to stop.
We need a national course correction. AI must be forced to operate within clear ethical, constitutional, and spiritual boundaries. If a technology replaces human labor, undermines autonomy, manipulates biology, or suppresses free will, then it should be rejected outright.
We were not made to be cared for by machines. We were not created for consumption and digital sedation. We were made to work, to struggle, to grow, and to glorify our Creator in the process. The machine cannot give us that. Only real life can.
It is time we defend it before it is gone."