Mental health helpline ends data-sharing relationship


A mental health text helpline will no longer share data with an AI customer support business, after criticism of the relationship.

Loris.ai used anonymised data from Crisis Text Line (CTL) to develop AI systems to help customer service agents understand sentiment in chats.

One CTL board member tweeted they had been "wrong" to agree to the relationship.

The helpline said it had listened to its community's concerns.

A similar UK helpline uses CTL software, but has been assured its data was not passed to Loris.

CTL, which is based in the US, told the BBC it ended the data-sharing relationship with Loris on 31 January, and had asked for its data to be deleted.

Its vice-president Shawn Rodriguez told the BBC: "Loris has not accessed any data since the beginning of 2020."

The use of the data gained wide attention after a report by Politico.

In August, Loris said its "AI-powered chat solution" helped customer service representatives "understand the customer's sentiment" and dynamically craft "effective responses based on the customer's tone".

Loris added that its origins "come from the highly acclaimed Crisis Text Line, where the ability to handle the toughest conversations is critical. It is this intelligence, drawn from analysing nearly 200 million messages, that sits at the core of the AI".

The company told Politico its AI had since "evolved" to include e-commerce and other data from businesses.

CTL says any data shared was fully anonymised and scrubbed of personally-identifiable information.

The helpline says it is transparent about data sharing with its users - its terms of service and privacy policies are sent automatically "to every initial text message that Crisis Text Line receives, and to which all texters consent", it said.

But Politico spoke to experts strongly critical of the arrangement. One questioned if those experiencing a mental health crisis could fully consent to data-sharing.

CTL "may have legal consent, but do they have actual meaningful, emotional, fully understood consent?" Jennifer King, privacy and data policy fellow from Stanford University's AI Institute, told Politico.

Some involved in CTL now also question the arrangement with Loris.

In a tweet after the article's publication, board member Danah Boyd said that on reflection: "I was wrong when I agreed to this relationship." She also wrote a blog post setting out in detail her thoughts on the issue.

CTL has now ended data-sharing. It said in a statement on its website it had "listened closely to our community's concerns".

It said it was making changes and working to increase transparency for users: "We heard your feedback that it should be clear and easy for anyone in crisis to understand what they are consenting to when they reach out for help," it said.

The helpline says it has had more than 6.7 million conversations with people in need of help.

And Mr Rodriguez wrote that "the power of data and artificial intelligence" remained central to CTL's work assisting those in need of mental health support.

Data was used to identify those at risk and get them help as fast as possible, he said.

"And, data is used successfully to de-escalate tens of thousands of texters in crisis experiencing suicidal ideation," he added.

UK data

CTL's technology is used by other organisations including UK-based Shout 85258, which offers a free text messaging mental health support service.

The charity recently announced it had passed one million conversations with people needing support.

It told the BBC that only thoroughly anonymised data was shared with CTL, and that its data - which is held on secure servers in London - had not been given to Loris.

Shout says "a strict data transfer agreement with CTL" only allows the US hotline to use the data to provide and improve a "technology platform for delivery of the Shout service" which the charity licenses on a pro-bono basis.

CTL also told the BBC: "No UK users were affected by the sharing of our anonymised US data with Loris".