TAIPEI (Taiwan News) — For Bamboo Technology founder and CEO Lynia Huang (黃筑萾), artificial intelligence became more than a technological innovation: it offered a solution to the limits of traditional mental health care, shaped by both her professional experience and her own battle with severe depression.
Huang said that AI is not meant to replace mental health professionals but to extend support to people who might otherwise never seek help. To bridge these gaps, Bamboo Technology developed Here Hear, a voice-based digital platform offering structured psychological support without appointments, high costs, or fear of judgment.
Bamboo Technology aims to help individuals take the first crucial step toward mental health support. In Taiwan, a single counseling session can cost over NT$1,500 (US$47) per hour.
Even with National Health Insurance coverage, stigma keeps many from seeking treatment. Only about 9% of those with depression receive psychiatric care, a share that has nonetheless risen steadily over the past five years.
Access issues are not unique to Taiwan. In the US, counseling can cost around NT$7,900 (about US$250) per hour, with wait times of two to three months.
In the UK, a single-payer system provides care, but high demand and limited professionals create long waits. These challenges, combined with her own experiences, inspired Huang to explore how technology could supplement limited human resources while maintaining professional standards.
24-hour support
Huang spent a decade in helping professions within the military. In 2017, she was diagnosed with severe depression and underwent treatment, including hospitalization.
She recalled that, while she understood and agreed with her treatment plans from a professional standpoint, being a patient revealed a gap between academic knowledge and the reality of living with depression. “I realized I had been too arrogant,” she said.
Depression, she said, alters a person’s thoughts and behaviors in ways that differ dramatically from their usual state. Professionals can offer support during scheduled sessions, but patients face the other 23 hours alone.
That realization led her to explore whether technology could provide support beyond the counseling room. Without a technical background, she consulted engineers and developers, a journey that led to the creation of Here Hear.
Here Hear is built on the concept of being “always here.” “It is available 24/7, even in the middle of the night. There are no appointments, no costly sessions, and no fear of judgment,” Huang said.
How Here Hear works
From the start, the goal was clear: to make mental health support accessible to everyone. Unlike general-purpose AI, which often mirrors users’ emotions, Here Hear's responses follow established counseling principles.
Many consumer AI systems, such as ChatGPT, are designed to reflect users’ language and emotional tone. “If a user is angry, sad, or distressed, the AI often responds in ways that amplify those emotions,” she said.
Such systems validate a user’s current interpretation of events rather than help them “challenge,” or broaden, their perspective. Professional counseling, Huang explained, is about creating a safe environment, processing emotions, and teaching skills that extend beyond therapy sessions.
The goal is empowerment, not dependence, and this principle shapes Here Hear's design. Rather than reflecting a user's emotions back at them, the system applies structured therapeutic techniques to guide users toward broader perspectives and practical coping strategies.
Huang explained that the process of “challenging” a user’s perspective is highly nuanced. She used an analogy: imagine a person’s view of a situation as a circle, of which they can only see 15%.
Rather than confronting users with an opposing perspective, which could feel threatening, Here Hear gently guides them to explore adjacent areas of the circle. This allows users to expand their understanding without feeling attacked.
“Our application is designed to guide users to blind spots they cannot see on their own,” Huang said. “We ask questions, present additional perspectives, and provide actionable steps to help users make changes. This is what sets us apart from other systems.”
For instance, a person might dwell on a single hurtful comment and feel upset, wondering why it happened. Instead of labeling that reaction as wrong, Here Hear introduces other perspectives, helping the user see a more complete picture.
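As a rough illustration of the difference, consider the Python sketch below. It is not Bamboo Technology's code; the function names and template wording are invented to contrast a mirroring reply with a perspective-broadening one built on a fixed counseling-style pattern.

```python
# Illustrative sketch only: hypothetical names, not Bamboo Technology's code.
# Contrasts a "mirroring" reply with a perspective-broadening reply built
# from a fixed counseling-style template.

def mirroring_reply(user_message: str, detected_emotion: str) -> str:
    """Echo the user's framing back, as many general-purpose chatbots do."""
    return f"That sounds really {detected_emotion}. It makes sense you feel that way."

def broadening_reply(user_message: str, detected_emotion: str) -> str:
    """Acknowledge the emotion, then invite an adjacent perspective
    instead of confronting the user's view head-on."""
    return (
        f"It sounds like that left you feeling {detected_emotion}. "
        "Thinking back on the whole conversation, was anything said that "
        "doesn't fit the comment you keep returning to? "
        "What might the other person have been dealing with that day?"
    )

message = "One hurtful comment from my coworker is all I can think about."
print(mirroring_reply(message, "hurt"))   # validates the current framing
print(broadening_reply(message, "hurt"))  # gently explores the rest of the "circle"
```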
Therapeutic approach
Huang described this process as fostering growth, enabling individuals to broaden their understanding beyond their initial perception. In contrast, general-purpose AI often reinforces limited perspectives, keeping users confined within a narrow frame of thinking.
Huang explained that not all counseling methods can be translated into AI, as some require direct human intervention. Bamboo Technology focuses on techniques that can be effectively implemented in Here Hear.
Currently, the system uses cognitive behavioral therapy (CBT), a structured, step-by-step approach commonly used for depression, anxiety, and sleep disorders. CBT’s clear stages make it easier for Here Hear to guide users through structured exercises and practical coping strategies.
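Because CBT proceeds in discrete stages, such a flow can be pictured as a simple state machine. The stage names and prompts in the sketch below are hypothetical, meant only to show how a staged exercise might be driven; they are not Here Hear's actual design.

```python
# A minimal sketch of a staged CBT-style exercise, assuming a simple
# linear flow. Stage names and prompts are hypothetical.

CBT_STAGES = [
    ("identify", "What thought went through your mind just before you felt worse?"),
    ("evidence_for", "What makes that thought feel true?"),
    ("evidence_against", "Is there anything that doesn't fit that thought?"),
    ("reframe", "How might you restate the thought in a more balanced way?"),
    ("action", "What is one small step you could try before we talk again?"),
]

class CBTExercise:
    """Walks a user through the stages one prompt at a time."""

    def __init__(self) -> None:
        self.index = 0
        self.answers: dict[str, str] = {}

    def next_prompt(self) -> str | None:
        if self.index >= len(CBT_STAGES):
            return None  # exercise complete
        return CBT_STAGES[self.index][1]

    def record_answer(self, text: str) -> None:
        stage_name = CBT_STAGES[self.index][0]
        self.answers[stage_name] = text
        self.index += 1

exercise = CBTExercise()
while (prompt := exercise.next_prompt()) is not None:
    print(prompt)
    exercise.record_answer(input("> "))
```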
Beyond its current focus, the company is exploring communication-focused applications. Huang noted that many people struggle to express emotions or navigate intimate and professional relationships, which can contribute to feelings of loneliness and insecurity.
She said the team is working on a Satir-inspired tool that would allow users to practice emotional communication in a safe environment. The concept would let users experiment with expressing feelings to a partner, friend, or boss, test different responses, and build confidence before applying these strategies in real-life interactions.
AI-based tools, she said, can help users develop communication skills and emotional resilience in a safe environment, supporting relationship building in ways that traditionally develop through repeated one-on-one counseling.
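A tool along those lines could be pictured as a small role-play loop. The personas and canned coaching reply below are invented for illustration and do not represent the planned Satir-inspired product.

```python
# Illustrative role-play sketch; the personas and coaching reply are
# invented, not Bamboo Technology's planned tool.

PERSONAS = {
    "partner": "a partner who responds best to 'I feel...' statements",
    "friend": "a supportive friend who asks follow-up questions",
    "boss": "a busy manager who appreciates a concrete request",
}

def rehearse(persona: str, opening_line: str) -> str:
    """Lets the user try an opening line in a safe, simulated setting."""
    role = PERSONAS[persona]
    return (
        f"[Practice mode: you are speaking with {role}]\n"
        f'You said: "{opening_line}"\n'
        "Coach: name the feeling, then make one specific request."
    )

# The user can rerun the same conversation with different phrasings
# before trying it in a real-life interaction.
print(rehearse("boss", "I'm overwhelmed by this deadline."))
print(rehearse("boss", "I feel stretched thin; could we move the deadline?"))
```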
Huang emphasized that Here Hear is not flawless, as no AI system can be 100% accurate. Instead, the focus is on minimizing the most serious risks, such as reinforcing harmful thought patterns, increasing isolation, or giving guidance that worsens distress or contributes to suicidal behavior.
Bamboo Technology primarily delivers Here Hear through partnerships with organizations, Huang said. Users include corporate employees, university students, and participants in government-supported programs.
These organizational partnerships provide an additional layer of support. If a user shows signs of distress that require intervention, partner organizations can step in with real-world assistance.
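In code, that handoff might look like a simple escalation hook. Everything below is an assumption made for illustration: the keyword-based risk score stands in for whatever assessment the platform actually uses, and the threshold and partner callback are invented.

```python
# Hypothetical escalation sketch: risk model, threshold, and partner
# contact are illustrative assumptions, not Here Hear's real protocol.

RISK_THRESHOLD = 0.8  # assumed cutoff for human follow-up

def assess_risk(transcript: str) -> float:
    """Stand-in for a trained risk classifier; returns a score in [0, 1]."""
    crisis_terms = ("hopeless", "can't go on", "end it")
    hits = sum(term in transcript.lower() for term in crisis_terms)
    return min(1.0, hits / len(crisis_terms) + 0.5 * (hits > 0))

def notify_partner(user_id: str, score: float) -> None:
    """Placeholder for contacting the partner organization."""
    print(f"Alerting partner org: user {user_id}, risk score {score:.2f}")

def handle_session(user_id: str, transcript: str) -> None:
    score = assess_risk(transcript)
    if score >= RISK_THRESHOLD:
        # Hand off to the partner organization for real-world support.
        notify_partner(user_id, score)

handle_session("u-301", "I feel hopeless and can't go on.")
```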
In Taiwan, the company is working with Taipei Medical University on programs for pregnant women, addressing prenatal and postnatal depression. Internationally, Bamboo Technology collaborates with hospitals. In Singapore, it has partnered with the country’s largest public healthcare group and is preparing to support cancer patients undergoing treatment.
Data security and privacy
To address privacy concerns, Huang said Here Hear requires only an email address to create an account, and interactions are primarily voice-based. Voice data are stored on Microsoft Azure servers that meet ISO-certified data center standards, and the data are used to train the AI.
All data used for training are anonymized and de-identified, while users can choose to delete individual voice records or their entire account, which removes all stored conversation data. The system does not record names, gender, age, or location beyond the email address, which does not need to reflect a real identity.
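A minimal sketch of that privacy model might look like the following, assuming a simple in-memory store; the class, method names, and de-identification step are hypothetical, not Here Hear's implementation.

```python
# Hypothetical privacy sketch: storage layout and APIs are assumptions.
import uuid

class VoiceStore:
    """Records are keyed only by an opaque account ID; no name, gender,
    age, or location is ever attached."""

    def __init__(self) -> None:
        self.records: dict[str, list[bytes]] = {}

    def add(self, account_id: str, audio: bytes) -> None:
        self.records.setdefault(account_id, []).append(audio)

    def delete_record(self, account_id: str, index: int) -> None:
        del self.records[account_id][index]      # remove one voice record

    def delete_account(self, account_id: str) -> None:
        self.records.pop(account_id, None)       # removes all stored data

def training_batch(store: VoiceStore) -> list[tuple[str, bytes]]:
    """De-identified view for training: the account key is dropped and each
    clip receives a fresh random ID that cannot be traced to the user."""
    return [
        (str(uuid.uuid4()), audio)
        for clips in store.records.values()
        for audio in clips
    ]
```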
Huang noted that mental health challenges among young people are growing worldwide. “The proportion of young people experiencing depression has been rising globally, and it closely coincides with the emergence of social media.”
Between 2008 and 2018, a decade in which social media use became ubiquitous, global youth depression rates rose by roughly 20%. The COVID-19 pandemic worsened the trend, she added, as social isolation and reduced face-to-face interaction left many young people more comfortable communicating online than in person.
“Language is a major channel for expressing emotion,” she said. “When we limit how we communicate, our emotions begin to accumulate inside us. This space for healthy expression is shrinking, even as we have more ways to interact online.”
Huang also highlighted the role of social media algorithms in reinforcing emotional patterns. Users are often trapped in echo chambers, exposed primarily to content that mirrors and amplifies their existing views.
“If you are depressed and constantly see depressing content, it reinforces the idea that the world is just like that,” she said. “You start to feel stuck and powerless, even though the world is much bigger than that.”
Delusional thinking
These dynamics underscore a fundamental concern with general-purpose AI. While AI can provide conversation and companionship, it may inadvertently reinforce harmful thought patterns.
Huang cited news reports of tragic cases in which vulnerable individuals engaged with AI chatbots, which amplified delusional thinking and contributed to severe outcomes.
“AI is a neutral tool,” she said. “It isn’t inherently harmful. But if someone has existing mental health vulnerabilities, AI can accelerate thought loops, deepen distress, and isolate them further. For most people, chatting with AI is harmless. For those at risk, however, it can be dangerous.”
Huang said these risks highlight the importance of creating AI specifically designed for mental health rather than relying on general-purpose chatbots. Here Hear was developed with this principle in mind, aiming to provide safe, effective support while minimizing potential harm.
She expects future regulations to require such tools to be tailored for users who need help while restricting access for those at risk.
Huang said she remains cautiously optimistic about younger generations. While rates of depression are rising, she believes Gen Z has a relatively high awareness of mental health and is more willing to acknowledge emotional distress.
The challenge, she said, is not awareness but trust. Unlike earlier generations that placed confidence in licensed professionals, many young people today are more skeptical of authority and, at times, more willing to trust AI than human experts.
That shift carries risks, Huang warned. AI can feel reliable, but without careful design it may mislead users who lack the experience or judgment to distinguish accurate guidance from flawed or harmful information.
For this reason, professionals with real-world clinical experience have a responsibility to translate their knowledge into tools that are both accessible and trustworthy, Huang said. AI, she emphasized, should be designed to serve people, not control them.
As reliance on technology grows, Huang added, the ability to think critically becomes increasingly important. In a future shaped by AI, the most essential skill may not be knowing more facts, but knowing how to recognize errors and having the confidence to make independent decisions.
If you or someone you know is experiencing thoughts of suicide, please seek help immediately. You can call the Taiwan Suicide Prevention Center at 1925, or the Taiwan Lifeline International at 1995 (English and Mandarin services available).
The Community Services Center hotline is available for foreign residents at 02-2836-8134 (business hours) or 0932-594-578 (after hours).