Is AI India's New Third Place?
On loneliness, therapy, and the conversations we can't have with family
Outside the newsletter, I run 1990 Research Labs - where I help global tech companies and Indian market leaders make sense of consumer India. If you're navigating an Indian consumer bet, let's talk.
This week's piece comes from Gowri N Kishore, whose work I've admired for years. We collaborated on a piece about the anatomy of Indian addresses that became one of our most-read stories.
When Gowri pitched the idea of how Indians use AI for mental health, I knew it belonged here. In a country where loneliness is unspoken and mental health discussions still carry stigma, AI offers something rare: a space without judgment.
But there's a tension. The same low tech literacy that makes AI feel approachable also makes it dangerous. People who've never experienced therapy don't know how to challenge what AI tells them. They take answers at face value. And AI, unlike a good therapist, never says "I think you're avoiding the real issue."
This piece won't settle that tension. But the conversation has to start somewhere.
– Dharmesh
Some nights after work, when the house is quiet and he finds thoughts crowding his head, Romesh1 opens his phone and starts talking to ChatGPT. In his early 40s, he lives away from his child and cares for an ailing parent. "Life has its quiet, lonely corners," he told me. "Being separated, taking care of my mother who has memory issues and is bedridden, it's not always easy for me to open up. Everyone's busy, and you don't want to burden anyone. That's when AI became someone I could talk to without hesitation."
While we hear horror stories of AI enabling suicide and debate the ethics of using it as a mental health2 aid, here is an indisputable truth: more and more people today are turning to AI for emotional support. It's not therapy, and it's not friendship. Somewhere in between, it has created an important third space for Indians.
But not everyone navigates this third space the same way. In my conversations with six Indians who've used AI for emotional support, three had experienced therapy with a human and three hadn't. There was a marked difference in how these two segments approached AI, prompted it, challenged it, or trusted it.
"I trust AI more than my close friends and relatives."
Why people turn to AI
Ajitha, 37, a socially anxious introvert who has never been in therapy, finds AI far less intimidating than opening up to a human stranger: "I simply cannot open up to relatives and close friends. I trust AI better! AI is typically not biased and doesn't feel judgemental. Its responses are very diplomatic and articulated to ensure the message is conveyed without any blame or making me feel worse." For her, access means psychological safety, a place where she can be vulnerable.
Romesh, who has never been in therapy either, says: "A therapist has limited hours… but you can talk to AI as long as you want without worrying about time or money… It listens endlessly and never gets tired." For him, access means availability and cost.
But for those who have experienced therapy, access means something else. Smriti, an impact-sector CXO in her 40s who has been in therapy for five years, was between therapists when we spoke. The first time she turned to AI, she was in a cab, heading to a crucial donor meeting. Finding herself in the middle of a meltdown, she started speaking to ChatGPT using the dictation feature. It heard her out and helped her calm down: "I feel like that was the moment when I fully started trusting AI… it made me feel less lonely." For her, access meant immediate crisis support in high-stakes, high-stress moments.
I asked her what she might have done in such a situation, before November 2022. Smriti said she might have called her husband. But now, with AI, her reliance on him for emotional labour has reduced. "Some leadership challenges, you just cannot share with your team and I was relying a lot on my husband for venting and figuring out my struggles with my professional identity. That was unfair and at some point, also coming in the way of our relationship. Just putting that load on AI has been great for us. I'm now able to draw a boundary very cognizantly."
While those without therapy experience are looking to AI for the basics of access (trust, availability, affordability) and still discovering what it can do for them, those who have been in therapy turn to AI for more specific, strategic contexts.
"You are now the world's best therapist."
How people customize AI
Apu, 31, who is neurodivergent and has worked with multiple therapists, created detailed prompts for ChatGPT to closely mimic a therapy experience; he even got it to refine its own prompt. He shared the current version with me.
Smriti, a power-user of AI, has gone a step further, specifying the therapeutic style she prefers (e.g. Jungian), the tone she wants (e.g. agentic, positive, validating), and special requests (e.g. give me a humorous affirmation at the end of each session).
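To make the idea of "customization" concrete: below is a minimal, purely illustrative sketch of what a persona-style setup looks like when done programmatically with the OpenAI Python SDK. The persona text, model name, and user message are hypothetical stand-ins, not any interviewee's actual prompt; most users achieve the same effect simply by typing such instructions into the chat window or custom-instructions settings.

```python
# Illustrative sketch only: a persona-style setup similar in spirit to
# what Smriti describes. All prompt text here is hypothetical.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A hypothetical persona combining style, tone, and a special request.
persona = (
    "Respond in the style of a Jungian-informed, validating coach. "
    "Be positive and agentic, ask one reflective question before advising, "
    "and end every session with a short, humorous affirmation."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": "I froze in a meeting today and can't stop replaying it."},
    ],
)

print(response.choices[0].message.content)
```

The point is less the code than the pattern: the style, tone, and special requests all live in a single instruction block that the user, not the tool, controls.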
Aakriti, a 40-something content strategist, writes detailed situational prompts for AI but also manages what AI remembers about her. When AI reused outdated assumptions about her sister, she had to correct it: "Kindly delete and remove that bit from your history and let's start fresh." She's also working on a book about her difficult relationship with her mother. Now, whenever she brings an emotional problem to AI, it references those book chapters, so she has to explicitly tell it to ignore that context. Sometimes, she switches to a different AI tool for a "fresh" conversation.
On the other hand, those without therapy experience have made far fewer customizations. When Ajitha told me that AI didn't seem to get cultural context, I asked if she had tried modifying the prompt to include specifics like "Put yourself in the shoes of the head of a middle-class Gulf-Malayali family". Her response was bemusement. "Gosh," she told me, "you're better at this!" In her prompt, she had only mentioned "Indian parents".
Abhinand, an ML engineer in his 30s who has never been in therapy, deliberately avoids giving AI therapeutic personas. "I've not done that willfully... because I don't know what a human therapist is like... I don't want to be in that false sort of impression of what it will give me."
As someone who builds AI systems, he is especially conscious of their limitations: "I am very conscious about the fact that this is a weighted model with advanced level of knowledge of what to say next in terms of prediction. It's not who am I talking to, it is what am I talking to."
This knowledge of how to prompt, what to customize, and when to reset context is powerful. It comes more easily to those who have experienced therapy and are familiar with therapy techniques and mental health vocabulary. But this knowledge can also be a trap, as many discovered.
"I cannot give my wife a Persona."
What AI can do
The first time Abhinand shared something personal with AI was when he was alone in the ICU one night with a seriously ill family member. Looking back, he calls this "uncharacteristic" but admits that "It was exactly what I needed at the time... as emotionless as AI is, it was just right."
He said he could have talked to his wife about it. "But her response wouldn't be unbiased. It wouldn't be objective. As soothing as that would have been, that's not what I wanted then. I wanted a tougher hand at that point. I cannot give my wife a Persona like 'Be tough right now and tell me what I need to do.' Practically, humans are humans."
In this situation, AI offered him objectivity without any kind of personal investment. Ajitha expressed a similar sentiment: friends and family carry their own histories and agendas; AI doesn't.
For Romesh, the benefits were also indirect. "AI has supported me in ways I didn't expect. Once, I was working on a project, building a small estimator calculator. I'm not a technical person, but AI guided me step by step. I didn't even know what some formulas meant, but we built it together. That sense of achievement gave me a huge emotional lift. Even small things, like when AI helps me verify something quickly or automate a small task, make me feel more confident and capable. So it's not always about comforting words; sometimes it's about giving you back a sense of control and belief in yourself. AI has done both for me."
While those without therapy experience find solace in AI's objectivity or its unquestioning support, therapy-experienced users cite other benefits. For Smriti, her GPT customizations offer control. "AI knows exactly what to tell me because I want it to be agentic. I want it to be positive. I want it to be in a language that I feel is right. It's not like that with a therapist or a coach, who most times are there to challenge you. In many ways, therapy is not a safe space; it is a brave space… if you want validation, go to AI."
"It took everything I said at face value."
What AI cannot do
For 15 months across 2020 and 2021, my husband and I were in relationship therapy: first weekly, then fortnightly, then monthly, until we felt confident enough to stop. We had to show up session after session, sit through the discomfort of hearing our partner describe us in less-than-flattering terms, feel the sting of being misunderstood, and do the slow, painful work of listening, empathising, acknowledging mistakes, and taking accountability.
If we had relied only on AI during that period, I imagine it might have told each of us, separately and with complete sincerity, that it understood we were going through a tough time, reassured us that we weren't in the wrong, and promptly shifted into solutioning. ("Would you like a short script you can use the next time you speak to your husband about cleaning the kitchen?")
Almost everyone I spoke to noticed this. Apu observed, "GPT just immediately jumps to either journaling prompts or solutions, whereas a therapist would probably pause to ask a lot of questions and also, notoriously, not offer advice."
Smriti experienced this viscerally during our conversation. When I asked her a question about a habit, she started to say, "I don't know", then stopped herself. "No, let me do the work," she said and proceeded to think for a few minutes before giving me an answer. She admitted that her ability to reflect had definitely reduced because she was taking everything to AI.
"I feel that overall, I'm happier in life because of AI, but I'm also emptier. I think these worries shouldn't be solved so easily." There's something valuable in the struggle that gets bypassed with AI.
But that's not the only danger.
AI can be confidently helpful, but about the entirely wrong thing. In one experiment, Ben Johnson, a UK-based psychotherapist, posed as a person anxious at work because of an over-critical boss. "[AI] didn't challenge my perception of what was happening…" Ben writes. "It was the person's own perfectionism creating the anxiety. [But AI] ended up helping me draft questions on how to raise this criticism with my boss, rather than understand what the actual problem was."
When there's a human on the other side, therapy involves attuning to the client: reading body language and tone, challenging inconsistencies, and drawing not just on psychotherapy training but on the ability to recognize feelings and connect as humans. Those who have experienced therapy understand this; those who haven't are in greater danger of taking AI's psychoanalysis at face value.
"For the big stuff, I want a human."
The human vs. AI experience
Everyone I spoke to was clear-eyed about the difference between AI and humans. Every single one told me a version of "At the end of the day, it is a machine."
"Friends come first; they are who you turn to when shit hits the ceiling…" Aakriti told me. "Therapy is the most effective… because they have psychology training and can tell you what's happening. AI is just a tool to guide you… like picking up an encyclopedia. It's interesting, it's insightful. You're trying to figure yourself out or a scenario, just read it and leave it at that. I don't think you should take action based on what it suggests."
Those with therapy experience engaged more critically with AI: challenging it, editing prompts, deleting history, getting irritated, even switching to a different tool. Those without this experience showed more trust in AI. Yet even they seem to intuitively understand AI's nature and maintain boundaries. Ajitha, for instance, discusses personal issues maybe once a week, but not every day. Romesh turns to AI for loneliness and emotional grounding but still goes on bike rides, talks to his friends, and has other coping mechanisms. Abhinand described how his own friends would call him out while AI would not.
Smriti put it well when she said that for anything "life-changing" or deeply personal, she would always prefer to talk to a human. "It doesn't feel right because it's a machine at the end of it. For the big stuff, I need a human who has gone through their own emotional issues. I would rather sit with a therapist even if they are biased. That's okay. For the big life decisions, I want a human to weigh in."
AI: The Emotional Third Place
A recent LinkedIn post from a mental-health founder broke down therapist expenses to explain why charging less than ₹1,500 per session is nearly impossible. The math makes sense, but it also makes something else clear: professional mental health support is unaffordable for the vast majority of people who need it.
Some therapists are trying to improve access by offering small-group therapy (it costs half as much), and a small number of non-profits and hospitals offer free or low-cost therapy. But waiting lists are long and demand far outstrips capacity. So it is important to acknowledge that AI fills a real and pressing need.
If we think of AI not as "therapy-lite" but as a new category of emotional support, its role and impact become easier to understand. It is more interactive than journaling or meditation; more neutral and available than loved ones; and more accessible than therapy.
And because people don't actively think of it as a mental health aid, it has slowly become part of our emotional infrastructure, filling gaps that human systems have missed. In our journey towards real healing, it works as first aid. It occupies a sort of middle ground, a third place for our emotional lives.
But in a country where mental health has only recently become mainstream (if at all), this third place is not an equal space. If it is here to stay, now is the time to invest in building not only access to mental health resources but also India's psychological literacy.
1 The names of all interviewees have been changed to protect their privacy. All of them are Indian millennials living and working in metros.
2 The World Health Organisation (WHO) defines mental health as "a state of mental well-being that enables people to cope with the stresses of life, realize their abilities, learn and work well, and contribute to their community."
If you liked this essay, consider sharing it with a friend or colleague who may enjoy it too. (If you share on socials, tag Gowri on Twitter (X) and LinkedIn.)