Opinion

Is AI More Human than Humans Are?

By Haniel Andrea V. Mizukami | July 19, 2025

THEY say that “laughter is the best medicine,” but in the Philippines, that laughter can turn into poison for people living with mental health issues, who are ridiculed or shamed in the name of our “resilient culture.” While our generation has embraced the topic of mental health more openly, not everyone has. In many Filipino families, deciding to prioritize your emotional well-being earns you one of two responses: you are being too sensitive, or you are not “insane” enough to need a psychiatric professional. In a country where mental health and therapy are still taboo, some have turned to an artificial means of help: artificial intelligence, or AI.

You might be thinking, “AI na naman?” but hear me out.

Growing up, I was taught that kindness is a moral compass, and that we should spread it without expecting anything in return. Yet I have watched the same people who taught me this throw careless remarks at others, invalidating whatever feelings those people may have had. In a society filled with self-righteous people, how is anyone expected to deal with their emotions?

As our society progresses, we keep finding new ways to make support more accessible to people who need it. For so long, many Filipinos kept their mouths shut, absorbing hateful remarks and letting their feelings boil beneath the surface. Now, many have discovered a way to seek comfort without repercussions: turning to artificial intelligence, whether through Character.AI, where they talk to fictional AI characters, or ChatGPT, where they ask for advice on their problems.

You might ask why this seems harmful at all when the people using it are not being harmed. The uncomfortable answer is that empathy is one essential quality humanity falls short on.

Debates have broken out on social media, particularly on X (formerly Twitter) and Facebook, over whether this new way of seeking comfort does more harm than good for a person’s mental well-being. Some argue that consulting AI serves them better: it has no capacity to judge them and asks for no payment in return, at a time when therapy remains unaffordable for many.

“AI is harmful because it lacks emotional intelligence” is a common argument for why these chatbots are dangerous to use. But if AI is so harmful to society, why do people still seek solace in it? Many believe that most people are not empathetic enough to build a real, meaningful connection, and say that AI has shown them more empathy than people ever have.

One might say that yes, it is unfortunate that as a society we have let AI chatbots become more welcoming than we are, and I agree. But at the same time, we should not shame people who turn to artificial means of help. We keep arguing that accessible therapy is needed for everyone, yet we still cover our ears when someone is in dire need of support. If we want people to stop relying on AI for their emotional well-being, the change must begin within ourselves.

One step toward making therapy more accessible is to normalize mental health services for students, workers, and families alike, giving everyone a safe space here in our country. Going to therapy does not mean you are crazy or baliw; it proves that you care enough to look after your overall well-being.

AI may imitate empathy, dressing it in thoughtful, flowery words, but there is one thing it can never replicate: human connection. We are complex, intricate beings who grow through how we see, interact with, and connect to one another.

Choose to be kind. Advocating for accessible therapy must start with us, as we are humans made to interact with and shape one another. While AI can offer support, nothing replaces the deep understanding we can give each other as people.