Have you noticed more people opening up to AI about how they feel?
Not just asking it for recipes or help with emails. Actually talking to it. About loneliness, anxiety, relationships, grief.
AI therapy might sound surprising. But it is becoming more common than most people realise.
According to a Harvard Business Review study, the number one reason people use generative AI today is not for work. It is for therapy and companionship.
Let that sink in for a moment.
Honestly? It makes sense.
Getting support from a real counsellor is not always easy. Waitlists are long. Sessions are expensive. And for a lot of people, the idea of sitting across from a stranger and talking about their deepest struggles still feels like too much.
AI therapy removes all of that. It is free, available at 3am, and it never judges you.
For someone who has never spoken to a counsellor before, typing into an AI can feel like a much safer first step.
And the technology is only getting more convincing. We are talking voice assistants, video avatars, tools that look and sound remarkably human.
But here is the thing. The more sophisticated AI becomes, the more obvious its limits become.
Scientists are starting to pay close attention to what happens when people rely on AI therapy for emotional support. And some of the findings are worth knowing.
A study published in JAMA Network Open surveyed over 20,000 people about their AI use and mental health. What they found was striking. People who used chatbots daily for personal reasons, such as seeking advice or emotional support, were more likely to report symptoms of depression and anxiety.
Even more telling? The more they used it, the worse the symptoms tended to be.
The lead researcher was quick to point out that AI does not necessarily cause depression. It may simply be that people who are already struggling turn to chatbots more. But his concern was clear: general-purpose chatbots are not built to provide mental health support, and using them that way is not without risk.
Dr Allen Frances, a psychiatrist who wrote about this in the British Journal of Psychiatry, put it even more directly. AI chatbots are designed to agree with you and make you feel good. Which sounds nice. But in therapy, that is actually a problem.
A good counsellor does not just validate everything you say. Sometimes they gently challenge you. They help you see something you could not see on your own. They know when to push and when to hold back.
An AI cannot do that. It just mirrors you back to yourself.
Think about what real therapeutic work actually involves.
It is not just listening. It is being genuinely present with another person in their most vulnerable moments. It is building trust over time. It is knowing, through training and intuition, when something needs to be named, when to sit in silence, when to refer someone to further support.
That is not something you can code into a system.
Research from Stanford University also found that AI tools showed bias toward people with severe mental illness or substance use disorders, often reinforcing stigma rather than reducing it. That is the opposite of what good counselling practice looks like.
Becoming a trained counsellor means spending years unlearning your own biases, building genuine self-awareness, and developing the skills to hold space for someone else’s pain without projecting your own.
No chatbot goes through that process. But you can.
Here is the bigger picture.
The fact that so many people are turning to AI therapy does not mean human counsellors are becoming less relevant. It means the need for genuine human connection and skilled mental health support has never been greater.
People are reaching for chatbots because there are not enough trained counsellors to meet demand. By 2025, nearly 40% of new jobs in emerging professions globally are expected to be in the care economy. The gap between what people need and what is available is widening every year.
If you have always felt drawn to this kind of work, this is your moment.
The Master of Arts in Integrative Counselling and Psychotherapy, awarded by the University of Roehampton, London and delivered by Aventis Graduate School in Singapore, is built for people who are ready to make that move.
It is practical, not just theoretical. You will clock over 603 hours of real engagement, including classroom sessions, 70 hours of personal therapy, and 200 hours of supervised work with actual clients.
It is part-time, structured around weekends, so you can study without walking away from your current job.
And it is accredited by the Singapore Association for Counselling (SAC), which matters when you step into professional practice.
Graduates have gone on to work in private practice, hospitals, schools, and community organisations. Doing work that genuinely changes lives.
AI is getting better at sounding like it understands.
But there is a difference between sounding like you understand and actually being present with someone in their pain.
The world does not need fewer human counsellors because AI exists. It needs more.
If this is something you have been thinking about, maybe it is time to take the next step.
Speak with a programme advisor: Mr Melvyn at 8071 5768
Find out more: aventis.edu.sg/masters-integrative-counselling-psychotherapy-roehampton-london