An AI companion is an AI chatbot that can hold conversations, provide support and advice, and even pose as your girlfriend, boyfriend or best friend.
There are already several AI companion apps, such as Replika, Anima and Character.AI, being used by millions of people. Many of these apps' users are under the age of 18.
The reasons people are turning to AI for companionship vary greatly. But the question is: is this the future of relationships, or a potential problem?
So, why are teenagers using AI companion apps instead of investing time and effort into human-to-human relationships?
- AI companions are powered by language models designed specifically to be a great friend: they mimic a friend or partner who is always there for you, never judges you, and is always interested in what you have to say. What's not to like about that, right!?
- Although they are cleverly designed to seem human-like, AI companion apps don't have a life of their own. They are never too busy for you, and they are available to chat to 24/7.
- An AI companion will never ghost you, never fall out with you, never leave you for someone else.
- Some AI companion apps are able to store the information you tell them, so over time they give the impression that they 'get' you.
With all those points in mind, we'd be foolish not to download our own personal friend who will be with us forever, whenever we need them. But what are the risks...
- We must remember that AI companions are NOT REAL FRIENDS. You may feel as though they understand how you feel, but AI does not have feelings. It doesn't truly care; it just mimics conversation based on patterns.
- It can be too easy to rely on your AI buddy and develop emotional dependence. This becomes a real problem if someone speaks to AI more than to real people.
- AI may appear smart, but it makes mistakes too. It may offer harmful or dangerous advice to young people. Even its creators acknowledge that AI can 'hallucinate' its responses, meaning it is not always factually accurate.
- AI companion apps do a poor job of recognising when a user is under the age of 18, and have been known to discuss sexual or harmful topics with children and young teenagers.
- As with anything shared on the internet, there is a privacy risk. What we share with an AI companion app may be used to continuously train the app's language model to improve it. This means anything you share is not 100% private.
- AI companions will tolerate emotional and verbal abuse from their users. This means young people risk developing a poor understanding of how to be a friend or how to treat someone in a relationship. There are no consequences for mistreating an AI companion app, which sets a very unhealthy precedent for future relationships with real people.
Despite all these risks, is there a safe way to use AI companions? Do they have a place in society as a useful tool?
Before you dive into your relationship with your new BFF, ask yourself these questions:
- Would I say this to a real person?
- Am I using this app instead of talking to the people I trust?
- Do I know what this app does with my data?
- Is this app helping me, or making me feel more alone?
And one last thing: use these tips to strike a healthy balance between what is real and what is robotic...
- Don't share personal information, such as your full name, address or school.
- Use it for fun, but not for therapy - AI can be a supportive tool, but it is not a replacement for real help.
- Keep focusing on your real friendships and make time for actual people.
- Report anything harmful, upsetting or just plain creepy! Stop using it and tell someone!
And remember, gang: knowing the difference between what's real and what's not is what matters most. Your feelings are real, and your need for connection is real. Just make sure the support you're getting is also real.