With Twitter's possible demise on the horizon, I have been exploring other social media options for sharing my content and connecting with my communities.
I was on Tumblr for the first time in years and exploring some hashtags related to mental health when I got a new message. The last time a stranger messaged me on Tumblr turned out to be some kind of p*rn account, so I was wary. Instead, I was introduced to Kokobot, an AI that reaches out to people to offer them support with their mental health.

I am 100% in favor of making mental health support available and accessible, and I fully understand and appreciate the value of peer support. At the same time, this made me nervous. Who is it connecting? What is being shared? Is this supposed to be a substitute for mental health treatment?
Here is my experience chatting with Kokobot and my thoughts on it as a resource, from my perspective as a psychologist.

I want to point out that they say the information you share is anonymous. BUT there is a reason I tell clients not to use any social media platform to try to message me – these platforms are not HIPAA compliant. Employees at Tumblr can view your direct messages, so the conversation is not fully confidential, and if any information tied to your account (such as your email address or IP) can be connected to your real identity, it is not fully secure.
That being said, I’m glad that Kokobot is transparent about a commitment to not selling your data. I hope this means they are also not giving it to anyone. My point is, just always be careful about claims of anonymity online. Computers always know.

This is a pretty standard mood check-in, similar to ones I have used with clients in the past. No red flags here.

Letting users choose from a list can make it less daunting to share what they are going through, and it makes it easier for the AI to determine how to proceed. I was concerned to see a category for Eating Disorders, since this is clearly a peer support system and not a referral to clinical services. EDs are among the most dangerous mental health issues, with some of the highest mortality rates, and there is a huge online community that encourages people with EDs to continue engaging in disordered and dangerous behavior. But when I said that was my issue, Kokobot did not connect me to a peer and instead offered educational information.

If you select an issue that is appropriate for peer support, Kokobot asks for more details. I tried several different ones and got the same prompt in response.

If you’re not sure what to say, Kokobot will suggest responses that other users have written. Also, if you write something profane or graphic, the AI will indicate that it isn’t appropriate to send out. I did not screenshot that message, but it basically said it understood if I did not mean to be inappropriate and asked me to re-word my message. This showed me there are protections in place for people using the bot, though of course I cannot speak to how effective they are.

Identifying negative thoughts is a cognitive behavioral technique that I do in many sessions. It can be included in a thought record as a way for clients to explore their thought processes on their own as well. Pretty standard.

This is basically an overly simplified version of cognitive restructuring. Again, not a substitute for therapy, but it could have benefits for users.

If you say that you want to help others, it walks you through a very brief and simplified training on how to express empathy for another person. It is incredibly basic though, so I’m not sure about the quality of responses that users are getting.
If you choose to talk to other users, it gives you the option to skip any responses you don’t feel comfortable answering, or you can flag them as inappropriate. I do not know what Kokobot does with posts that are flagged, unfortunately.
Overall, my feeling towards Kokobot is cautious optimism. I can see it being a valuable resource that connects people to supportive peers when they are having a hard time, but I worry about its potential for abuse, and I would like to know more about the steps it takes to prevent misuse. I would also like to see more resources offered to connect users to professionals when appropriate – I didn’t see anything about that when I tried it.
We definitely need more resources that meet people where they are, and you can’t get more “meeting people where they are” than sliding into their DMs on social media.
Did you talk to Kokobot? Tell me about your experience!