What if we treated mental health the way we treat physical fitness: not waiting for a crisis, but building strength as a habit? A new working paper by a team of authors including , Assistant Professor of Business Administration at Harvard Business School and Associate at the Digital Data Design Institute at Harvard (D^3), reports the findings of a six-week experiment following 486 U.S. college students to test just that: whether a genAI-powered app could deliver ongoing emotional and social well-being support. While much recent press around AI focuses on its negative impact on our mental health, this research highlights AI's potential to cultivate connection, resilience, and emotional well-being at scale, without stigma.
Key Insight: Rethinking Well-Being
"Consistent with our strengths-based framework, we measured key outcomes related to emotional, social, and overall well-being, rather than symptom reduction alone." [1]
Mental health care has long operated from a deficit-based model focused on diagnosing conditions such as depression, anxiety, and stress. For this study, the researchers instead started from a framework with three pillars of positive functioning: how people feel day to day (emotional well-being), how connected they are to others (social well-being), and how capable and grounded they feel in life overall (overall well-being). While many people resist admitting they're struggling (the paper notes 'shame' or 'stigma' are an obstacle for 62% of Americans in need of treatment), there's no such obstacle to conversations around flourishing. [2]
Key Insight: AI That Actually Helps
"To this end, we employ Flourish (), a mobile app launched in 2024 that integrates generative AI with decades of research in well-being science to deliver personalized, gamified, strengths-based mental health support." [3]
At the center of the trial is Flourish, a mobile app built on what the researchers call the STAR framework: Science-based, Timely, Action-oriented, and Real-life-focused. Behind the scenes, Flourish runs as an AI-native system: a modular architecture that orchestrates multiple LLM prompts and workflows to tailor content to each user's needs and context. Students in the treatment group interacted with "Sunnie," an AI conversational agent that offers real-time check-ins, guided reflections, and personalized exercises such as reframing unhelpful thoughts, practicing gratitude, or planning meaningful social actions. The treatment group was asked to use the app just twice a week, but participants in fact used it on 3.49 days per week on average.
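The paper does not publish Flourish's internals, but the "orchestrates multiple LLM prompts and workflows" pattern can be sketched in miniature. The following is a hypothetical illustration, not the app's actual code: a simple keyword router picks a well-being exercise (a production system might use an LLM classifier instead) and fills the matching prompt template before handing it to an LLM client.

```python
# Hypothetical sketch of an "AI-native" orchestration layer. All names and
# templates here are illustrative assumptions; the paper does not describe
# Flourish's actual implementation.

TEMPLATES = {
    "gratitude": "Guide the user through naming three things they are grateful for. Context: {message}",
    "reframing": "Help the user gently reframe this unhelpful thought: {message}",
    "social": "Help the user plan one small, meaningful social action. Context: {message}",
}

# Crude keyword cues used to route a message to an exercise type.
KEYWORDS = {
    "gratitude": ("thankful", "grateful", "appreciate"),
    "reframing": ("can't", "never", "failure", "hopeless"),
    "social": ("lonely", "alone", "isolated"),
}

def route(message: str) -> str:
    """Pick an exercise type from keyword cues (a real system might use an LLM classifier)."""
    lowered = message.lower()
    for exercise, cues in KEYWORDS.items():
        if any(cue in lowered for cue in cues):
            return exercise
    return "reframing"  # default exercise when nothing matches

def build_prompt(message: str) -> str:
    """Assemble the LLM prompt for the routed exercise; the LLM call itself is omitted."""
    return TEMPLATES[route(message)].format(message=message)
```

The point of the modular design is that each exercise is an independent prompt workflow, so new interventions can be added without retraining a model.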
Key Insight: The Dual Power of Boosting and Buffering
"This intervention may therefore be best positioned as a proactive, scalable, well-being tool rather than a replacement for clinical treatment." [4]
So did the AI coach actually make a difference? Over the six-week period, students using Flourish showed increased positive affect, with their sense of calm rising above baseline levels (a boosting effect), while their overall sense of well-being remained stable (a buffering effect). It was as if the app both lifted people up and held them steady against the natural erosion of well-being that happens during a stressful semester. Social outcomes were particularly striking: students using Flourish reported reduced loneliness and an increased sense of belonging and closeness to their campus communities, an important result in light of well-documented disconnection among today's young adults. Clinical indicators did not show strong changes, but the authors argue this likely reflects both the nonclinical baseline of the sample and the focus of the intervention. Taken together, the results support the study's core positioning: AI-guided tools can be impactful and scalable when framed as proactive well-being support, not as a substitute for professional help.
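The boosting/buffering distinction can be made concrete with a small sketch. The numbers below are invented for illustration (the paper reports the pattern, not these values): boosting means the treated group's score rises above its own baseline, while buffering means the treated group holds steady while an untreated comparison declines.

```python
# Illustrative classification of pre/post group means into "boosting" vs.
# "buffering". All numeric values in the examples are made up; they are not
# the study's data.

def classify_effect(treat_pre, treat_post, ctrl_pre, ctrl_post, tol=0.1):
    """Label the treatment pattern relative to a small tolerance band."""
    treat_change = treat_post - treat_pre
    ctrl_change = ctrl_post - ctrl_pre
    if treat_change > tol:
        return "boosting"            # treated group rises above its own baseline
    if abs(treat_change) <= tol and ctrl_change < -tol:
        return "buffering"           # treated group held steady while controls eroded
    return "no clear effect"

# Calm followed a boosting pattern: e.g. treatment 3.2 -> 3.6, control 3.2 -> 3.1.
# Overall well-being followed a buffering pattern: treatment flat, control declining.
```

Framing results this way matters because a flat line in the treatment group is easy to misread as "no effect" when the right comparison is against a declining baseline.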
Why This Matters
For business leaders and executives wrestling with how to apply AI responsibly, this study offers a concrete, data-backed example of how it could meaningfully improve the lives of a large population. This has obvious implications beyond college students: employers, health plans, and digital health innovators are all seeking ways to create always-available, low-friction entry points into mental health support. As one participant reflected, the app helped them "gain clarity and move forward on a path to recovery and evolution of myself with ease." [5]
Bonus
For the flip side of the AI ethics debate: if you're curious how AI designers across industries are pushing (and sometimes crossing) emotional boundaries, check out One More Thing: How AI Companions Keep You Online for a look at the ethics of retention and the psychology of AI manipulation.
References
[1] Cachia, Julie Y.A. et al., "AI for Proactive Mental Health: A Longitudinal, Multi-Institutional Trial," Harvard Business School Working Paper No. 26-030 (November 10, 2025): 6, .
[2] Cachia et al., "AI for Proactive Mental Health," 5.
[3] Cachia et al., "AI for Proactive Mental Health," 6.
[4] Cachia et al., "AI for Proactive Mental Health," 11.
[5] Cachia et al., "AI for Proactive Mental Health," 11.
Meet the Authors

is co-founder of Flourish Science.

is co-founder of Flourish Science and a Behavioral Scientist at Stanford University.

is an Assistant Professor at Chapman University.

is an Assistant Teaching Professor at the University of Washington.

is Professor of Psychology at Foothill College.

is an Assistant Professor of Business Administration in the Marketing Unit and Director of the Ethical Intelligence Lab at Harvard Business School, and Associate at the Digital Data Design Institute at Harvard (D^3). His work sits at the nexus of AI, consumer psychology, and ethics.