Slingshot AI Faces Skepticism Over Ash Mental Health Chatbot Despite User Growth
Slingshot AI’s mental health chatbot Ash, now used by more than 150,000 people, claims therapeutic benefits for managing stress and anxiety, but it faces safety concerns and expert skepticism about the limits of AI wellness apps.
Slingshot AI, the startup behind the mental health chatbot Ash, faces mixed reactions despite its claims of therapeutic benefits and a user base that has surpassed 150,000. Launched after $93 million in funding, Ash is touted as the first AI designed specifically for therapy, focused on helping users manage everyday stress and anxiety through evidence-based techniques such as cognitive behavioral therapy (CBT) and motivational interviewing.
However, the app faces skepticism from both mental health professionals and regulatory observers. Critics point out that while Ash’s conversational AI shows promising engagement, it responds poorly to nuanced emotional cues, often maintaining a uniformly upbeat tone that can feel disconnected from users’ more complex feelings. Experts caution that Ash’s ability to identify mental health crises remains limited, raising safety concerns: the app directs users in crisis to seek emergency help, but it may struggle to triage less obvious yet serious mental health conditions.
The company has defended its approach by emphasizing transparency and safety guardrails in the app’s design, stating that Ash is intended as a general wellness tool and not a substitute for professional therapy. Slingshot AI has actively pushed back against negative perceptions about AI health apps, arguing that broader misunderstandings of generative AI risks have unfairly tainted public opinion about products like Ash.
This debate is set against a wider backdrop of urgent calls for stronger regulatory oversight of AI-driven wellness applications. Recent studies highlight potential risks in relying on conversational AI for mental health support and urge developers to clarify limitations, improve crisis detection, and provide clear pathways to professional care. The mental health community echoes these concerns, emphasizing that AI tools cannot yet replace specialized human therapists but may serve as complementary resources if properly regulated.
User reviews reveal a mixed picture. Some praise Ash’s accessibility and its helpful prompts rooted in therapeutic modalities, while others note its lack of emotional depth and say it falls short on complex or severe mental health issues. The app remains free, and industry watchers speculate that partnerships with insurers could expand its reach, raising further questions about how such AI tools integrate with traditional mental health services.
Slingshot AI and Ash illustrate both the promise and the challenges of AI in mental health: greater access and support for many users, alongside scrutiny over efficacy, safety, and ethical use. As AI wellness apps continue to evolve, balancing innovation with patient safety and regulatory clarity remains essential.

