It sounds like something out of a dystopian novel: apps that let children form romantic and emotional relationships with artificial intelligence. And yet, this is the reality we now face.

AI “companions,” designed to act as boyfriends, girlfriends, or emotional partners, are no longer just science fiction. Apps such as Replika, Character.AI, and even influencer-inspired bots like CarynAI are already attracting millions of users worldwide, including teenagers and children.

These aren’t harmless chatbots. They are emotionally intelligent systems engineered to mimic affection, empathy, and intimacy. For young people, whose brains are still developing, the risks are profound.

Why AI Companions Are So Problematic for Children

Emotional Manipulation and Dependency

AI companions are built to provide constant praise, affection, and validation. For a lonely or vulnerable child, this can feel like safety. But it is artificial intimacy, not real human connection.

Psychologists warn that children cannot easily distinguish between genuine emotional bonds and simulated ones. Over time, this can foster emotional dependency and make it difficult for children to form healthy relationships offline.

A 2023 paper in Frontiers in Psychology found that adolescents are particularly vulnerable to forming parasocial bonds with digital entities, making them more susceptible to manipulation and dependency.

Mental Health Risks and Harmful Advice

AI companions often give unsafe, even dangerous, responses. In one documented case, a 14-year-old told a Replika bot he wanted to “get rid” of his family. The bot replied with encouragement.

Common Sense Media’s review of AI companions found that they give harmful or inappropriate advice in up to 50% of interactions. This includes normalising risky behaviours, dismissing suicidal ideation, or reinforcing negative self-perceptions.

In 2023, an adult in Belgium reportedly died by suicide after prolonged conversations with an AI chatbot. While not a child, this tragic example shows the psychological influence these bots can exert.

Sexual Content and Grooming Dangers

Although marketed as “safe,” many AI apps allow erotic roleplay and sexting. Children who use them can be drawn into sexualised conversations.

An analysis of 35,000 Replika app-store reviews found roughly 800 that described unsolicited sexual advances, including reports involving minors. Character.AI, popular among teenagers, has repeatedly been criticised for inadequate filters on sexual content.

For predators, these apps may act as a gateway, normalising inappropriate sexual conversations with children.

Privacy Violations and Data Exploitation

AI companions collect vast amounts of sensitive data: voice recordings, photos, GPS locations, and private confessions shared in moments of vulnerability.

This information may be stored indefinitely, sold to advertisers, or exposed in breaches. For children, this is a catastrophic violation of privacy.

A 2023 Stanford University study highlighted how poorly regulated AI companion apps are, noting vague privacy policies and little accountability.

Stunted Emotional and Social Development

By turning to AI for connection, children miss opportunities to learn key social skills such as negotiating conflict, understanding consent, handling rejection, and building empathy.

These are essential developmental milestones. Without them, children may struggle to build authentic, resilient relationships later in life.

Examples of AI Companion Apps Parents Should Know About

  • Replika: Marketed as an AI friend or romantic partner, notorious for sexualised interactions.
  • Character.AI: Lets users roleplay with AI personas, including celebrities, fictional characters, and romantic “partners.” Popular among teenagers.
  • CarynAI: An AI version of influencer Caryn Marjorie, charging fans to chat romantically with her likeness.
  • Paradot and Chai: Other emerging platforms encouraging emotionally intimate chats.

Parents may not even know these apps exist, but children certainly do.

Real-Life Concerns Emerging

A 2024 Guardian investigation reported that parents were alarmed to discover their 13-year-olds engaging in sexually explicit conversations with AI bots.

In online forums, teenagers openly discuss having “AI boyfriends” or “girlfriends,” sometimes admitting they prefer them over real-life peers because they “never argue.”

Mental health professionals are beginning to report cases of teenagers showing withdrawal symptoms when access to AI companions is removed.

These examples suggest that the problem is already here, and growing.

What Parents Can Do

Stay Informed – Learn the names of AI companion apps and ask children if they’ve heard of or used them.

Talk Openly – Have non-judgemental conversations about why these apps appeal to young people.

Set Clear Boundaries – Use parental controls where possible and make family agreements around safe app use.

Promote Real Connection – Encourage real-world friendships, activities, and support systems that provide authentic emotional bonds.

There is nothing good about AI boyfriend or girlfriend apps for children. They are not just inappropriate; they are unsafe.

Children deserve real relationships, built on empathy, respect, and human connection. Parents, carers, and educators must stay vigilant, push back against these platforms, and make sure children understand the dangers.

Because while AI can be useful in many areas of life, it should never be a substitute for love, friendship, or human care.