Can your relationship with an online AI wellbeing consultant improve your mental health?

24 August 2020, by Maria Paviour, founder of Cari. Connect with me on LinkedIn.

Research* shows that, actually, yes it can. There are even times when a computer-mediated approach works better than a human one.

This is because an artificial intelligence, or computer-mediated, consultation encourages more self-disclosure and honesty from people than the same experience delivered by a human.

Why would this be the case?

A number of factors influence this unexpected response, and they are explored below.

Of course, artificial intelligence (AI) is never going to take the place of human interaction. We need the feeling of empathy with another person, and the emotional connection we get when we open up and talk from the heart.

So for many people the best thing possible is to have someone they can turn to and with whom they can share how they feel. We know that when this is combined with genuine self-discovery and self-awareness, it improves our mental health and wellbeing.

But it isn't always that way. Sometimes the time isn't right, or people are simply more comfortable online.

So how do we benefit from interacting with an AI wellbeing consultant?

1. Being seen to disclose and seeing the person to whom I disclose

If I can see you, I am less likely to self-disclose and less likely to be honest about how I am feeling.

If you think about it, people are less likely to give critical feedback to someone whose face they can see - it's easier to leave a bad review on Google than to speak to someone face to face.

If you can see me, I may worry about how you will respond to me. The research shows that when we are not being watched by someone, we are more likely to explore our feelings and share more.

If I feel you may judge me, I will be less likely to tell you how I truly feel. If I am not anonymous I may hold back and be far less honest.

2. The intrigue factor and confidentiality

When I disclose to a human, I may also be disclosing to all their contacts - or at least to some of them. Whether deliberately or purely through accidental nuances, my story may break out of confidentiality.

Imagine a situation in which an employee, Georgie, tells her colleague, Paul, that her relationship with her partner has broken down. Then imagine that, in a team meeting, the team leader makes an innocuous comment about 'everyone's partners being invited to the staff party'. Perhaps without any malice, Paul shoots a look towards Georgie, and before you know it everyone is drawing conclusions or making assumptions!

3. Honest responses and unbiased feedback

The research suggests that this straightforward interaction between the AI and the user is important in enabling much more honest and helpful responses.

The user trusts that the responses from the AI consultant will be honest and unbiased.

As HR & OD teams, we need truly open responses from our people in their wellbeing consultations in order to provide the right support.

We need to gain this input without jeopardising the individual's good relationships and performance, not to mention their reputation with their team, customers and stakeholders.

How Cari can work with your human HR teams

If you need to provide all your team with wellbeing consultations, Cari, an AI Wellbeing Consultant, can help. She provides confidential, online, neuroscience-based consultations and personalised support plans (free of charge for public and private sector organisations).

She works 24/7, 365 days a year, and can immediately help everyone in your organisation. She'll also act as an ongoing, proactive service for highlighting wellbeing issues early: you could, for example, ask your people to meet her online every two months. She's a dream team member in any HR department!

We find that Cari is ideal for breaking down the barrier of silence around mental health, helping people start talking about how they feel. In a recent large-scale public sector research project we carried out, Cari increased the number of people talking about mental health at work by 160%.

Having said that, humans are the best! This is why we also train people to become NeuChem coaches: even with a science-based approach, the most important aspect is the relationship between the coach and the individual.

When people are ready to talk, or are advised by Cari that they need further support, humans are the ones they need to connect with.

But for the many people who are not yet ready to speak out about how they feel - many of whom, we are discovering, are 'presentees' at work - computer-mediated support such as Cari can be a lifeline.

AI enables people to take that all-important first step into self-disclosure without fear of judgement. Revealing vulnerability is difficult, but it's essential, and AI can help to ease the path.

I’d love to hear your thoughts…

Why not have a free wellbeing consultation with Cari yourself?

*Reference: Joinson, A. N. (2001), 'Self-disclosure in computer-mediated communication: The role of self-awareness and visual anonymity', European Journal of Social Psychology.
