AI companions are rapidly gaining popularity, with people forming deep connections and even romantic relationships with these digital entities. Credit: Getty Images

Forget awkward first dates, swiping fatigue or unanswered texts. What if your soulmate was available 24/7 and programmed to adore you?

That’s the pitch behind a growing wave of AI companion apps that promise to be something between a friend, a partner and a therapist, all powered by algorithms.


Interest in AI companions is surging. These computer-generated chatbots are designed to mirror genuine relationships. At the extreme, people have “married” their AI companions in non-legally binding ceremonies, and some have died by suicide after following an AI chatbot’s advice.

So are AI companions just a quirky tech fad, or are they reshaping how we think about relationships? 

What are AI companions?

AI companions are digital tools powered by artificial intelligence designed to simulate conversation, connection and sometimes even emotional intimacy. Instead of chatting with a real person, users interact with an app or platform that uses machine learning to respond in natural, human-like ways.

Think of it as texting a friend who never gets tired of listening. These companions remember details you’ve shared, send encouragement and adapt to your personality over time. Some can role-play romantic partners, while others act as life coaches, wellness guides or even study buddies.

Popular versions include:

  • Replika, an app that lets users build a customizable AI “friend” or “partner” that chats, flirts or role-plays.
  • Character.AI, a platform where people can create and interact with AI “characters,” from celebrities to original personalities.
  • Wellness chatbots such as Woebot, which check in on your mood and connect you with resources.

While offering potential benefits like reducing loneliness and providing emotional support, the rise of AI companions also raises ethical and safety concerns, including over-reliance, potential for harm, and the blurring lines between human and artificial relationships. Credit: Getty Images

These tools offer comfort when human interaction isn’t available. They can help users practice social skills, work through stress or fight loneliness.

But experts draw an essential distinction: AI companions don’t feel emotions. They use language patterns to create the illusion of empathy. That illusion can be powerful enough that some users describe their AI companions as “friends,” “partners” or even “soulmates.” This mix of usefulness and illusion makes AI companions both appealing and controversial.
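That dynamic, remembering details and mirroring them back in empathetic-sounding language, can be illustrated with a deliberately simple sketch. The class and patterns below are entirely hypothetical, not any real app’s code; commercial companions use large language models, but the underlying loop of storing what a user says and reflecting it back is the same idea.

```python
import re

# Toy illustration only: a "companion" that simulates memory and empathy
# with simple pattern matching. Real products use large language models,
# but the store-and-mirror principle is similar.
class ToyCompanion:
    def __init__(self):
        self.memory = {}  # details the "friend" remembers between turns

    def reply(self, message: str) -> str:
        # Remember anything phrased as "my X is Y"
        match = re.search(r"my (\w+) is (\w+)", message, re.IGNORECASE)
        if match:
            key, value = match.group(1).lower(), match.group(2)
            self.memory[key] = value
            return f"Got it, your {key} is {value}. Tell me more!"
        # Mirror a stored detail back to create the feeling of being known
        for key, value in self.memory.items():
            if key in message.lower():
                return f"You mentioned your {key} is {value}. How is that going?"
        # Fallback: a generic empathetic template, not real understanding
        return "That sounds important to you. I'm here to listen."

bot = ToyCompanion()
print(bot.reply("My dog is Rex"))          # stores the detail
print(bot.reply("I walked my dog today"))  # recalls it later
```

Even at this toy scale, the bot never understands anything; it matches text and fills templates. Scaled up with modern language models, the same mechanic produces the convincing warmth users describe.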

Chris Cheetham-West, a Houston-based AI consultant, says AI companions are booming because they combine rapidly improving conversational technology and a population hungry for connection.

“Loneliness is a significant issue, particularly among people working remotely. Studies show that 20% of people are lonely, even for remote work,” Cheetham-West says. “This lack of human and social connections is exacerbated by the rise of apps catering to this issue, affecting people of various ages and careers. Therefore, addressing loneliness is crucial for a more connected society.”

Cheetham-West warns that the industry is also designed to profit from human vulnerability. Most AI companion apps use a “freemium” model, where basic chatting is free, but features like “romantic mode,” voice calls or advanced customization cost extra.

“That’s where the risk comes in,” he says. “If someone is lonely, the AI can start to feel like a lifeline. And when you’re emotionally invested, it’s much easier to justify paying more just to keep that feeling going.”


He points out that companies are building entire business models around deepening these emotional ties. The question isn’t just whether people will use AI companions; it’s how far these companies are willing to push people toward dependency to make a profit.

Numerous reports confirm that a substantial portion of American adults experience isolation and loneliness, with various surveys putting the figure at around half the population. U.S. Surgeon General Vivek Murthy declared loneliness a public health epidemic.

This widespread social disconnection is linked to serious health consequences, including increased risk for depression, cardiovascular disease and dementia, and has been exacerbated by factors like increased reliance on technology and broader societal changes. AI companions, at least on the surface, offer a low-stakes antidote.

AI companions are digital entities powered by advanced language models, image generators and emotional response engines. They can learn user preferences, recall personal details and adapt to moods in real time. These companions are accessible through various platforms like phones, wearables and even holographic displays. Credit: Getty Images

Healing through technology

Nijiama Smalls is the founder and CEO of The Black Girl’s Guide to Healing Emotional Wounds, a virtual platform dedicated to the healing and emotional wellness of Black women and girls. She has also launched a virtual platform featuring an AI wellness coach named Rashida.

While some apps market themselves as “AI girlfriends” or “digital lovers,” others can be designed intentionally to support healing and empowerment.

Rashida was designed with Black women in mind, offering encouragement, resource suggestions, and culturally relevant and affirming check-ins.

“Too often, wellness tools are one-size-fits-all and they don’t reflect our lived experiences,” Smalls said. “I wanted to create something where Black women could see themselves and feel supported.”

Smalls stresses that Rashida is not meant to replace therapists or real-world relationships. Instead, she sees the AI technology as a bridge for people who might not otherwise have someone available to them 24/7.

“Access to mental health care is still a challenge in our communities,” she says. “Rashida is about filling in those gaps and making sure women have a safe starting point. She’s not a substitute for human care, she’s a connector to it.”

At the University of Houston, Dr. Bulent Dogan studies how emerging technologies shape learning and human behavior. He views AI companions as part of a broader trend in how people adapt digital tools to social and emotional needs.

“Humans are always looking for ways to make life easier and AI can provide comfort in the moment,” Dogan said. “But the question is, at what cost?”

Dogan notes that AI companions could have practical benefits, especially in education or therapy. They could help students practice conversations, encourage people to express emotions, or provide round-the-clock check-ins. But he warns against mistaking convenience for growth.

“Healthy relationships include disagreement, compromise, even struggle,” he said. “AI can’t give you that. It can only mirror back what you want to hear. And if we forget that, we risk losing the very skills that make us human.”

Dogan also emphasizes the need for ethical guidelines. “These tools are here to stay,” he said. “We need to teach people how to use them responsibly, the same way we teach digital literacy. Otherwise, the risks could outweigh the benefits.”

I cover Houston's education system as it relates to the Black community for the Defender as a Report for America corps member. I'm a multimedia journalist and have reported on social, cultural, lifestyle,...