
Ethical AI for Mental Health: Earkick's Perspective in 2025

K. A. Stephan, H. Bay. 2025. Whitepaper.

Abstract

An in-depth look at how Earkick’s ethical approach to mental health tools supports fair access, transparent use of data, and honest conversations about emotional challenges, including a free online chat option for employees.

1 AI and Mental Health: What We Can’t Afford to Get Wrong

In 2025, AI for mental health has moved from niche to necessary. No longer a fringe experiment or speculative tech demo, AI in mental healthcare is a mainstream reality.

According to the Mental Health in an AI World report, 41% of adults have already turned to AI chat agents for emotional or psychological support. Among people with a diagnosed mental health condition, that number rises to 48%.

However, more than 80% of those conversations happen on general-purpose platforms like ChatGPT or Gemini, tools never intended for clinical or emotional intervention. Only 18.5% take place on purpose-built platforms or artificial intelligence mental health apps that are grounded in psychology and built with ethical guardrails.

This mismatch has real consequences. In early 2024, the BBC spotlighted Character.ai, a platform where more than 20 million monthly active users worldwide interact with 18 million unique chatbots. Users, especially younger ones, engage in emotionally charged conversations with AI personas ranging from fictional characters to makeshift therapists. While some users described the experience as comforting or even life-changing, the coverage also underscored a critical reality: these bots, though engaging, are unregulated and built primarily for entertainment.

By 2025, that same platform was under public and legal scrutiny following a tragic teen suicide case involving one of its AI chatbots. An urgent debate around responsibility, safety, and the ethical limits of generative AI has since gained momentum.

2 Current Trends in Mental Health

One of the current trends in mental health is that millions of people are now having vulnerable, emotionally charged conversations with AI systems. And they’re doing so in the absence of better alternatives. Especially among younger generations, mental health support is shifting from couches to screens. Gen Z, more digitally native and stigma-aware, is turning to AI mental health chatbots as a first stop, not a last resort.

Meanwhile, global mental health care systems are at a breaking point. Waitlists are ballooning: in parts of the UK and Ireland, patients wait 6 to 18 months for therapy appointments. In the U.S., many wait weeks to months, especially in underserved areas. Costs are rising, and even where care is available, stigma, time constraints, and affordability keep many from reaching out.

And not everyone wants to talk to a therapist—at least not first. A 2024 McKinsey study showed that over 60% of Gen Z report symptoms of anxiety or depression, but only a fraction seek traditional care. Instead, they’re looking for emotional tools that are private, accessible, and available on demand.

Even if the world had enough mental health professionals to meet the staggering need—and by WHO estimates, it doesn’t—the system would still fall short. Work-related stress and burnout have become near-universal, with over 70% of workers reporting emotional exhaustion. Yet crisis hotlines and therapy systems remain fragmented, overloaded, and hard to reach when you need them most.

At the same time, the meaning of “mental health” is expanding. It’s no longer just about clinical disorders: it’s about emotional fitness, daily regulation, energy levels, purpose, and performance. AI tools are increasingly part of that lifestyle shift, helping people manage the complexities of daily life, not just crises, with more clarity.

Earkick steps into this moment intentionally, offering not just another AI chatbot, but a purpose-built, privacy-first companion grounded in real psychology and designed to support human resilience, not replace it. Where most tools chase engagement, Earkick prioritizes psychological safety, instant access, and emotional truth.

In 2024, gen AI was for brainstorming. By 2025, it’s become a go-to for emotional support, life planning, and even searching for purpose. And unlike general-purpose models, Earkick was built to guide, not to guess.

3 From Curiosity to Crisis — Why Users Turn to AI First

That shift from utility to intimacy has redefined how and why people engage with AI in the first place. What started as curiosity—asking an AI chatbot for advice on sleep, stress, or relationships—has rapidly evolved into a new kind of frontline support. According to the Mental Health in an AI World report, 63% of users tried AI for mental health simply “to see what it could do.” But those who stayed found something deeper: 81% said the 24/7 availability was its most helpful trait, while 27% were drawn in by the promise of anonymity. Among people already in therapy, 43% use AI chat support in parallel with their human sessions.

These conversations are door-openers, unlocking a level of honesty that often feels too risky with another person. People are offloading real emotions—stress, loneliness, self-doubt, even trauma—onto AI systems. Many users report feeling more comfortable opening up to a chatbot than to another human being. In fact, 46% say they’ve shared things with an AI mental health chatbot that they’ve never shared with a therapist.

But this trust is fragile.

When the technology behind those conversations is designed for entertainment or for maximizing screen time, the intent is not to help users get better. When it isn’t governed by psychological frameworks, users risk receiving irrelevant, inappropriate, or even harmful responses. So the challenge isn’t whether people will use AI for emotional support. They already are. The question is whether the tools they turn to are built to help them achieve healthy goals.

4 The Earkick Blueprint: Purpose-Built AI for Therapy, Not Entertainment

At a time when most emotional conversations with AI are still happening on general-purpose models like ChatGPT, Earkick took a different path: build a domain-specific AI for mental health from the ground up, with real-world psychology at its core and ethical safeguards at every layer.

Here’s how that blueprint comes to life:

1. Privacy-first architecture

Earkick’s mental health AI chatbot, Panda, doesn’t require registration. No account. No login. No personal identifiers. All interactions are encrypted using AES-256 and processed securely, with a fully offline mode planned for future versions. This means people can engage in vulnerable emotional conversations without ever worrying about their data being stored, sold, or misused.
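Earkick has not published its encryption pipeline, but for readers who want a concrete picture, here is a minimal Python sketch of what encrypting a check-in payload with AES-256 (in authenticated GCM mode) can look like, using the widely available cryptography library. The function names and payload are hypothetical illustrations, not Earkick’s actual code.

```python
# Minimal sketch: AES-256-GCM encryption of a check-in payload.
# Hypothetical illustration only; not Earkick's actual pipeline.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_checkin(plaintext: bytes, key: bytes) -> tuple[bytes, bytes]:
    """Encrypt a payload with AES-256-GCM; returns (nonce, ciphertext)."""
    nonce = os.urandom(12)  # 96-bit nonce, unique per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce, ciphertext

def decrypt_checkin(nonce: bytes, ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt and authenticate; raises InvalidTag if data was tampered with."""
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # 256-bit key, kept on device
nonce, ct = encrypt_checkin(b"mood: 3/10, slept badly", key)
assert decrypt_checkin(nonce, ct, key) == b"mood: 3/10, slept badly"
```

GCM mode both encrypts and authenticates, so any tampering with stored conversations would be detected at decryption time rather than silently accepted.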

2. Evidence-based, therapy-aligned conversations

Earkick’s guidance engine is rooted in clinically validated approaches like Cognitive Behavioral Therapy (CBT) and Dialectical Behavior Therapy (DBT). It mirrors structures a therapist might use: mood check-ins, cognitive reframing, emotion labeling, distress tolerance. This makes it a reliable companion for navigating everything from daily stress to deeper emotional cycles while never pretending to be a replacement for licensed therapy.
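The exact schema behind these check-ins is not public. Purely as an illustration, a therapy-aligned check-in record and a toy routing step might look like the following sketch; all field names and thresholds are assumptions.

```python
# Hypothetical sketch of a therapy-aligned check-in record.
# Field names and thresholds are illustrative, not Earkick's schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CheckIn:
    timestamp: datetime
    mood: int                              # e.g. 1 (very low) .. 10 (very good)
    emotion_labels: list[str]              # emotion labeling, e.g. ["anxious"]
    automatic_thought: str | None = None   # input for cognitive reframing (CBT)
    reframed_thought: str | None = None    # the balanced alternative
    distress_level: int | None = None      # 0..100, for distress tolerance (DBT)

def suggest_next_step(c: CheckIn) -> str:
    """Toy routing logic: pick a CBT/DBT-style exercise from a check-in."""
    if c.distress_level is not None and c.distress_level >= 70:
        return "distress tolerance: paced breathing"
    if c.automatic_thought and not c.reframed_thought:
        return "cognitive reframing prompt"
    return "mood reflection"

c = CheckIn(datetime.now(), mood=3, emotion_labels=["anxious"], distress_level=80)
print(suggest_next_step(c))  # -> "distress tolerance: paced breathing"
```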

Where most tools optimize for engagement, Earkick optimizes for trust. And that trust starts with building an AI that understands its limits, respects human vulnerability, and stays grounded in science.

As demand for accessible care grows rapidly and worldwide, mental health chatbots and tools providing AI psychotherapy or AI counseling are becoming the new normal. In fact, they’ve been shaping the future of mental health technology since the pandemic, but clinical trials on the effectiveness of generative AI therapy tools such as Therabot have only recently emerged.
What will set the next generation of tools apart is their ability to combine real-time tracking and AI emotional support with structured, therapy-aligned methods.

5 Technology Deep-Dive: Multi-Modal Insights at Scale

Unlike tools that rely solely on typed input, Earkick’s technology is multi-modal by design. It can process voice, text, and video check-ins. It also integrates with wearables to detect subtle biometric shifts, like movement or sleep changes, that often precede emotional downturns. Even factors such as weather are taken into consideration.

This layered sensing unlocks a richer picture of mental health. By analyzing tone, pace, sentiment, and behavioral patterns over time, Earkick’s AI personalizes support in real time. If your voice tightens or your sleep worsens, it tracks the change and adapts its guidance.
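How Earkick computes such deviations internally is not published. As a toy sketch of the general idea, here is a rolling-baseline check on nightly sleep duration in Python; the window size, threshold, and choice of signal are illustrative assumptions.

```python
# Toy sketch: flag a deviation from a personal rolling baseline.
# Window, threshold, and signal choice are illustrative assumptions.
from statistics import mean, stdev

def sleep_anomaly(hours_history: list[float], last_night: float,
                  window: int = 14, z_threshold: float = 2.0) -> bool:
    """True if last night's sleep deviates more than z_threshold standard
    deviations from the user's own recent baseline (needs `window` nights)."""
    baseline = hours_history[-window:]
    if len(baseline) < window:
        return False  # not enough data to form a personal baseline
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False
    return abs(last_night - mu) / sigma > z_threshold

history = [7.2, 6.9, 7.4, 7.1, 7.3, 6.8, 7.0,
           7.2, 7.1, 6.9, 7.3, 7.0, 7.1, 7.2]
print(sleep_anomaly(history, 4.5))  # True: well below personal baseline
```

The key point is that the comparison is against the user’s own history, not a population average, which is what makes the adaptation feel personal.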

Behind the scenes, big data analytics and AI in mental healthcare make this possible. The system learns which interventions help stabilize anxiety, when to suggest a breathing exercise, or when to surface motivational content instead of cognitive reframing. Rather than just reacting, it anticipates trends, all while preserving privacy through techniques like differential privacy and on-device inference.
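Differential privacy, mentioned above, works by adding calibrated noise to aggregate statistics so that no single user’s data can be inferred from the output. Here is a minimal sketch of the standard Laplace mechanism applied to a simple count; the epsilon value and the query are assumptions, not Earkick’s published parameters.

```python
# Minimal Laplace-mechanism sketch for a differentially private count.
# Epsilon and the query are illustrative; not Earkick's published setup.
import numpy as np

def dp_count(true_count: int, epsilon: float = 0.5,
             sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise scaled to sensitivity/epsilon.
    One user changes the count by at most 1, so sensitivity = 1."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# e.g. "how many users found the breathing exercise helpful this week"
print(dp_count(1234))  # noisy answer; individual contributions are masked
```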

To be of true value, artificial intelligence therapy has to move beyond generic advice and toward deeply adaptive support.

6 What 250,000+ Users Tell Us

Since its launch, Earkick has quietly become one of the most trusted AI companions for mental wellness. With over 250,000 users and an average 4.8-star rating in the app stores, the Panda-shaped buddy is downloaded, praised in reviews, and increasingly returned to.

On average, active users engage five times per week. Many users speak to Panda several times per day. That’s more frequently than most people see a therapist in a month. And the sessions are used in a broad range of situations. From managing work stress to breaking cycles of panic attacks, users describe Panda as a grounding force in their daily lives.

Compared to the broader AI mental health chatbot market, where 61% of people use a tool only a few times per month and satisfaction hovers around 79%, Earkick stands out. The combination of trust, privacy, and psychological alignment appears to drive deeper, more sustained engagement.

And while Earkick makes no claim to replace therapy, many users report that Panda helps them stay afloat between sessions or build the confidence to seek human help for the first time.

That’s the real win: Instead of replacing human connection, Earkick restores access to it.

References

  1. Mental Health in an AI World Report. Hemingway/Redburn, 2025.

  2. How People Are Really Using Gen AI in 2025. Harvard Business Review, 2025.

  3. American Psychological Association (APA). Workforce Access & Shortages Report. 2024.

  4. McKinsey Health Institute. Gen Z and the Mental Health Gap. August 2024.

  5. World Health Organization. Mental Health Atlas 2023. Geneva: WHO Press.

  6. World Health Organization. “Depression and Other Common Mental Disorders.” Fact Sheet, 2023.

  7. Gallup. State of the Global Workplace Report. 2023.

  8. Deloitte Insights. 2024 Global Human Capital Trends: The Worker Wellbeing Agenda.

Earkick AI Therapist is not a licensed psychologist or psychiatrist and is no substitute for professional mental health care. The app is intended solely for educational purposes and personal development. If you are experiencing a mental health crisis or need professional help, please contact a qualified healthcare professional.