Questions Emerge About Mixing ChatGPT With Mental Health


Society continues to assess the extent to which various forms of artificial intelligence (AI) can help individuals in their daily lives.

Horror stories tend to go viral, such as the case of a man who killed himself after apparent encouragement to do so from an AI chatbot known as “Eliza” on the Chai app.

In spite of this, debate continues over whether AI can be fine-tuned to safely meet individuals’ needs.


ChatGPT continues to be examined through this lens.

Blending AI and mental health support

A new study published in JAMA Network Open found that ChatGPT largely struggled to appropriately direct individuals to mental health support when they requested it, succeeding only 22% of the time. The remaining 78% of responses left much to be desired.

Some within the healthcare and medical communities believe there is room for improvement, specifically by tweaking the backend systems that determine how ChatGPT responds.