Tuesday, 26 Nov 2024

Mental health app faces backlash for testing ChatGPT to counsel 4000 users

AI chatbots may slowly be taking the place of lawyers, or even of you, but can they also take the place of your therapist?

Last week, Rob Morris, the co-founder of mental health app Koko, wrote on Twitter that his app had used the AI chatbot GPT-3 to counsel 4,000 people.

Morris explained that Koko used a ‘co-pilot’ approach, with humans supervising the AI as needed, across about 30,000 messages.

The platform found that messages composed with the AI were rated significantly higher than those written by humans alone, while response times fell by 50 per cent, to under a minute.

Despite the success among users, the test was shut down because the AI-assisted empathy sounded ‘inauthentic’.

This statement was misunderstood by Twitter users to mean that users did not know that they were talking to a chatbot.

Morris clarified to Gizmodo that the ‘people’ referred to in the tweet were himself and his team, not unwitting users.

In fact, Koko users knew the messages were co-written by a bot, and they weren’t chatting directly with the AI.

When AI was involved, the responses included a disclaimer that the message was ‘written in collaboration with Koko Bot’.

In a follow-up tweet, Morris wrote: ‘Some important clarification on my recent tweet thread: We were not pairing people up to chat with GPT-3, without their knowledge. (In retrospect, I could have worded my first tweet to better reflect this.)’

Morris acknowledged the shortcomings of using AI to provide empathy, noting that ‘machines don’t have lived, human experience’.

‘So when they say “that sounds hard” or “I understand”, it sounds inauthentic,’ said Morris.

‘It’s also possible that genuine empathy is one thing we humans can prize as uniquely our own. Maybe it’s the one thing we do that AI can’t ever replace,’ he added.

The experiment raises ethical questions about the risks of testing unproven technology on vulnerable users, but Koko is hardly the first company to try to use AI in place of humans.

DoNotPay is behind the world’s first ‘robot lawyer’ and has even come up with a tool that can speak to banks’ customer support using an AI-generated version of people’s own voices.
