
How I built and iterated a culturally aware mental health app for Latinx communities, evolving it from an early prototype to a near-launch product through rapid iteration, real-user insights, and AI-assisted development.

Overview

Project duration:

01/26 - 02/26

My role:

Product Designer/UX Designer (End-to-End)

Tools:

Maze, Figma (Design, Slides, Make), FigJam, ChatGPT, Claude, Lovable, Vercel


The product: 

Serēna is an AI-powered app that provides 24/7 psychological support for the Latinx community through intuitive chatbots tailored to users' cultural preferences, styles, contexts, and needs. The app aims to deliver personalized, responsible, and secure assistance without replacing professional advice, while enabling seamless connections to human help when required.


The Problem

Latinx communities grapple with elevated depression, anxiety, and suicidal ideation amid cultural stigma, language barriers, and limited access to professional help.

Only 16.4% of Hispanic adults received mental health treatment in the past year (Statista, 2025).

Current apps lack cultural personalization, robust security, substantive guidance, and high-risk detection (e.g., violence or panic crises), leaving users isolated.

The Solution

Our AI app empowers Latinx users by providing 24/7 culturally personalized emotional support through secure, empathetic chatbots—addressing stigma, barriers, and isolation with substantive guidance, high-risk detection, and seamless connections to human experts for true mental health equity.

The Goal

Design an intuitive app that delivers real-time emotional support, empowering users during moments of distress when professional help is inaccessible. By prioritizing user-centered interactions, we aim to create a seamless, empathetic tool that bridges the gap to immediate relief while guiding toward expert resources when needed.

[Icons: Define · Ideate · Prototype · Test]


E M P A T H I Z E

Step into the users' shoes

Latinx communities experience significant mental health access gaps in the United States. Hispanic adults are about 28% less likely to receive mental health treatment than the general population, and more than half of Latinx young adults with serious mental illness do not receive care, highlighting systemic barriers such as language limitations, cost, stigma, and lack of culturally competent providers.

Competitive Audits

[Images: competitive audit]

Danna's Notes: I crafted these user personas after diving deep into survey data from Latinx communities, capturing the raw voices of individuals like Maria, a 28-year-old immigrant who shared in an interview, "The stigma in my family makes it hard to talk about anxiety. I just need something that understands my roots." By building personas like hers, I ensured the app's features addressed real cultural barriers, turning empathy into actionable design that empowers users to seek support without fear.

[Images: user personas and empathy map]

Danna's Notes: I developed this empathy map by synthesizing insights from 25 user responses, visualizing what they say, think, do, and feel. This tool helped me uncover emotional pain points such as stigma and trust issues, guiding a more compassionate app experience tailored to Latinx resilience and community values.


D E F I N E

Frame the real problem

User Pain Points

       Lack of Cultural Personalization


Users encounter generic experiences that fail to adapt to their unique cultural contexts, individual needs, communication styles, and preferences, leading to feelings of disconnection and reduced engagement.

       Weak Trust from Inadequate Security


Weak data protection and privacy measures undermine user confidence, causing hesitation in sharing sensitive emotional details and increasing dropout rates.

       Superficial Language Without Substance


Overly polite or pleasant interactions often lack depth, failing to deliver reliable, actionable guidance and leaving users feeling unsupported in their moments of need.

       Insufficient Risk Detection


The app misses critical signals of high-risk situations—such as potential violence, suicidal ideation, panic attacks, or other crises—resulting in delayed escalation to professional help and heightened user vulnerability.


I D E A T E

Brainstorm the unexpected

Key Features

  • Personalized chatbot (based on quizzes, personality tests, surveys, background)

  • Strong security system (consents, passwords, facial recognition, memory, data controls, parental controls)

  • Conversation labels

  • Conversation search bar, filterable by topic

  • Automatic follow-ups (daily, weekly, monthly)

  • Scheduling with a professional healthcare team

  • Mental health support resources with immersive experiences 

  • Multimedia (voice access, wake word, documents, images)

  • Accessibility and Inclusivity (neurodiversity and diverse abilities)

  • Keyword-based risk alerts that connect users with human support
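To make the last feature concrete, here is a minimal, hypothetical sketch of keyword-based risk triage in TypeScript. The phrase lists, names, and thresholds are illustrative assumptions, not Serēna's production logic, which would pair a vetted clinical lexicon with model-based detection and human review:

```typescript
type RiskLevel = "none" | "elevated" | "crisis";

// Illustrative phrase lists only; a real system would use a clinically
// vetted, bilingual lexicon rather than a hand-picked handful of strings.
const CRISIS_PHRASES = ["want to die", "kill myself", "quiero morir", "no puedo más"];
const ELEVATED_PHRASES = ["panic attack", "ataque de pánico", "can't breathe", "hopeless"];

// Classify a single chat message by simple substring matching.
function assessRisk(message: string): RiskLevel {
  const text = message.toLowerCase();
  if (CRISIS_PHRASES.some((p) => text.includes(p))) return "crisis";
  if (ELEVATED_PHRASES.some((p) => text.includes(p))) return "elevated";
  return "none";
}

// A "crisis" result would escalate the chat to human support instead of the bot.
```

In practice a substring match like this is only a first pass; its value is that it fails fast toward escalation, so ambiguous messages can still reach a human.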

Key Screens

  • Splash screen

  • Home/Chat Dashboard: clean screen asking "How do you feel today?"

  • Therapist Directory

  • Resources: (meditations, breathing exercises, podcasts, prayers, journal)

  • Profile: users create a profile when opening an account, providing deeper info through quizzes, personality tests, surveys, and background

  • Account settings

  • Preferences

Sitemap

[Image: sitemap]

Danna's Notes: I structured this sitemap to create a logical, intuitive navigation flow for the app. Drawing from user feedback where one respondent noted, "I need quick access to help without getting lost," I prioritized seamless paths to chat dashboards and therapist directories, ensuring the architecture supports effortless journeys toward mental health equity.

User Flow

Danna's Notes: I mapped out this user flow to illustrate key pathways, such as onboarding through cultural assessments leading to personalized chatbot sessions. Inspired by a survey story where a user said, "In a crisis, I want instant connection without barriers," I designed flows with high-risk detection branches that escalate to human experts, making the app a reliable bridge from distress to relief.

[Image: user flow diagram]

Wireframes

Paper & Digital

[Images: paper and digital wireframes]

Hi-Fi Key Screens


P R O T O T Y P E

From concept to reality

[Image: hi-fi key screens]

For this app, I chose fresh, clear colors that evoke calm and healing: soft light blue and turquoise green, tones long associated with mental clarity, trust, and emotional safety. I intentionally kept the design minimal and uncluttered — generous white space, large readable typography, subtle micro-interactions, and warm, welcoming illustrations — so users never feel overwhelmed, especially in moments when their mind already feels heavy. Every screen was built to feel like a safe, gentle space where support arrives without judgment.


T E S T I N G

Learn from real users

Usability Study


I implemented a multi-layer validation framework combining synthetic user simulations, moderated usability testing, AI risk modeling, and cultural trust validation to ensure emotional safety, ethical guardrails, and equitable AI-driven support.

Affinity map

[Images: affinity map]

Usability Study Findings

       Core Features Are Discoverable, but Not Always Intuitive

 

Users were able to accomplish their goals, but often through trial-and-error behavior. This suggests that primary pathways are not immediately intuitive, especially for more complex or multi-step tasks.

       Personalization and Therapist Matching Create the Highest Cognitive Load


This pattern suggests that while users are motivated to personalize their experience and explore therapist options, these flows may feel dense, layered, or unclear in structure.

       Trust & Privacy Perception

 

Users demonstrate conditional trust in Serēna: they value the quality of its responses, but hesitate to share personal information due to privacy concerns.

The hesitation is not about AI capability; it is about data security and emotional vulnerability. Users need stronger transparency and control over their data before fully engaging.

Iterated Mockups


I designed the full product in Figma — interactions, components, system logic — and then used Figma Make to explore structural layout variations and push the concept further. Once I had what I wanted, I used Lovable to translate it into a functional application and deployed it on Vercel. So this isn't just a prototype you click through — it actually works.

Key Accessibility Considerations

Designing SERĒNA meant designing for users in moments of vulnerability. Accessibility was not a checklist; it was a design principle embedded in every structural and interaction decision. As I refined the dashboard, navigation, and privacy features, I focused on reducing cognitive load, increasing clarity, and building emotional safety through accessible design.

Cognitive Accessibility & Neurodivergent-Inclusive Design

I intentionally reduced cognitive load and considered neurodivergent users — including individuals with ADHD, autism spectrum conditions, or sensory sensitivities — by:

  • Reducing visual clutter

  • Creating predictable interaction patterns

  • Avoiding overwhelming microcopy

  • Structuring content in digestible, scannable sections

For users who process information differently, clarity and consistency are not enhancements — they are necessities.


Voice Interaction for Inclusive Support
To expand accessibility beyond visual interaction, I designed voice command functionality that allows users to speak to the app and receive spoken responses.

This supports:

  • Users with visual impairments


  • Users experiencing screen fatigue

  • Users who feel more comfortable expressing emotions verbally


Transparent Privacy & User Control
Research revealed that trust was a bigger barrier than usability. I introduced Guest Mode, plain-language privacy messaging, and visible “Delete Conversation” / “Private Session” options.

Accessibility, in this case, meant emotional accessibility — ensuring users can engage without fear of permanence or overexposure.
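As an illustration of the Guest Mode principle, a hypothetical persistence boundary might look like the sketch below. The `Session` shape and the store are invented for this example; the point is that guest transcripts never cross into persistent storage:

```typescript
interface Session {
  guest: boolean;      // Guest Mode: conversation lives only in memory
  messages: string[];  // in-memory transcript for the current session
}

// Hypothetical persistence boundary: only non-guest transcripts are saved,
// and the in-memory transcript is always cleared when a session ends.
function endSession(session: Session, persistentStore: string[][]): void {
  if (!session.guest) {
    persistentStore.push([...session.messages]);
  }
  session.messages.length = 0; // nothing sensitive lingers after the session
}
```

Making the boundary this explicit is what lets the UI promise "no permanence" in plain language: the guarantee lives in one place instead of being scattered across features.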


Key Takeaways

This project reinforced that in trust-based products, usability and emotional safety are inseparable. Through research, iteration, and structural redesign, I learned that clarity and control are often more impactful than feature expansion.

Trust Must Be Embedded in the Experience

Users trusted the AI’s intelligence but questioned data transparency. I learned that trust is not communicated through promises; it is designed through visible control, clear labeling, and structured navigation.

Autonomy Drives Engagement
When users feel ownership over their conversations and data, they engage more openly. Features like Guest Mode and conversation deletion shifted SERĒNA from a tool to a user-controlled safe space.

Information Architecture Shapes Emotional Comfort

Small navigation ambiguities created disproportionate friction. By restructuring the hierarchy and clarifying account vs. profile settings, I reduced confusion and improved confidence.

Cultural Personalization Invites Openness

Cultural personalization helps users feel seen and willing to tell their own stories, trusting they can be understood by people like them, and even by an AI shaped by the ideas and structures of their own culture.

Next Steps

This personal project reinforced the transformative power of user-centered iteration and taught valuable lessons that will shape my future work as a UX designer.

       Measure Perceived Trust Post-Iteration
Conduct targeted usability testing to evaluate how Guest Mode and privacy transparency influence perceived safety and willingness to share.

       Run a Formal Accessibility Audit
Validate voice interaction flows, screen reader compatibility, color contrast, and focus states to ensure inclusivity across abilities and contexts.

       Define Trust-Centered Success Metrics
Track metrics such as Guest Mode adoption, retention after the first session, and frequency of privacy feature use. These indicators will help measure whether users feel safe enough to return.
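As a sketch, retention after the first session could be computed from session logs roughly like this. The `SessionLog` shape (user ID plus a day index) is a hypothetical simplification of whatever analytics events the app would actually emit:

```typescript
interface SessionLog {
  userId: string;
  day: number; // days since launch, simplified stand-in for a timestamp
}

// Share of users who return within `windowDays` of their first session.
function firstSessionRetention(logs: SessionLog[], windowDays: number): number {
  const firstDay = new Map<string, number>(); // userId -> day of first session
  const retained = new Set<string>();         // users seen again in the window
  for (const log of [...logs].sort((a, b) => a.day - b.day)) {
    const first = firstDay.get(log.userId);
    if (first === undefined) {
      firstDay.set(log.userId, log.day);
    } else if (log.day - first <= windowDays) {
      retained.add(log.userId);
    }
  }
  return firstDay.size === 0 ? 0 : retained.size / firstDay.size;
}
```

Tracked alongside Guest Mode adoption and privacy-feature use, a number like this turns "do users feel safe enough to return?" into something measurable across iterations.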

* This is a self-initiated concept project designed to explore how technology could help address mental health access gaps in Latinx communities. All research was conducted with real users.
