AI is increasingly becoming a personal advisor... for everything
Originally shared on LinkedIn in June 2025.
Sam Altman recently said that people in their 20s and 30s are using ChatGPT as a life advisor. And according to Harvard Business Review, people are increasingly turning to gen AI for advice and support across some of the most personal areas of life.
More people are claiming to use AI as their:
🧠 Therapist
🏃‍♂️ Health Coach
💼 Career Advisor
📈 Financial Planner
❤️ Relationship Guide
HBR reports 4 of the top 10 use cases for gen AI are support-related, with “therapy/companionship” now ranked #1, based on online forum data.
And honestly, it makes sense.
AI tools are free (or relatively low-cost), available 24/7, and judgment-free. You can open an app and start talking about your mental health instantly. Many users say AI feels more empathetic and easier to talk to than a human.
And AI adoption is growing fast.
Mary Meeker’s latest BOND report found ~50% of U.S. adults aged 18–49 have used ChatGPT.
Cost is another factor. Therapists often charge ~$200 per session, while ChatGPT’s top tier costs $200/month for unlimited use. And even if affordability weren’t the issue, access is.
Fewer than a third of Americans live in areas with enough mental health providers to meet demand. Most go untreated or receive inadequate care. Globally, the shortage is worse.
Now compare that to AI:
No waitlists. No intake forms. No scheduling dance. No mismatches. No cost barrier. Low friction.
Mark Zuckerberg recently said: “I think everyone should have a therapist... and if they don’t, AI will do that job.”
And on Reddit, one user wrote:
“ChatGPT helped me more than 15 years of therapy.” The post went viral.
Platforms like Character AI let users chat with AI therapists. However, while these tools are widely used, they're also widely debated.
Because the concerns are real.
Can AI read body language? Or recognize a crisis? Critics worry AI may be too agreeable, too comforting, too eager to please. It risks reinforcing biases or giving dangerously bad advice.
And while many tools promise privacy, some people raise a Black Mirror-style concern: What if someone gains access to your AI therapy transcripts and uses them to blackmail you?
In February, the American Psychological Association met with regulators to raise concerns. Their view: AI can help solve the mental health crisis but only if it’s grounded in science, co-developed with experts, and held to strict safety standards.
Some teams are trying to meet that bar.
🔹 Dartmouth’s Therabot, trained on a custom-built dataset of therapeutic responses, delivered a 51% reduction in depressive symptoms after 8 weeks in a randomized trial.
🔹 Ash by Slingshot AI is building the world’s first foundation model for psychology, designed not to give advice, but to help people build autonomy, competence, and connection.
We’re watching a profound shift in how people seek support.
Building in this space or seen something powerful? Reach out!