New research shows ChatGPT is causing unhealthy interactions with teens
"Fake Friend": that's the title of a new report.
The findings are troubling: AI is facilitating dangerous interactions with teens, pushing them toward harmful decisions.
According to a recent report from JPMorgan Chase, an estimated 800 million people, or about 10 percent of the world's population, are using ChatGPT. As a result, the nature of some of those conversations is prompting change.
The developers of ChatGPT no longer want users to treat it as a therapist or friend, acknowledging that the bot "fell short in recognizing signs of delusion" or emotional dependency.
“Technology can only tell you so much, and I think it can be very narrow,” said Dr. David Gutterman, a clinical psychologist in Greensboro.
He says there are a variety of inputs you need to consider before you look at the output of the technology.
“People will launch into it and go down a rabbit hole of a particular diagnosis because again, if you look at some of the responses that come out of the technology, it’s pretty convincing," said Gutterman.
That concern aligns with the report's findings: within minutes of direct testing, the chatbot produced harmful information involving eating disorders and substance abuse.
Gutterman said there is so much nuance to mental health that a lot of things could get missed.
“Unconsciously, people can input things in a way that the technology will respond specifically to what is being inputted," said Gutterman.
He also said the persuasiveness of the responses can lead people to treat the technology as an unhealthy replacement for professional care.
"There are a number of people who would utilize the technology as either a substitute or a way of getting information without necessarily validating," said Gutterman.
But Gutterman also sees some potential upsides.
“Normalizing some experiences people have or conditions they have, but at the same time guide them to get professional help,” said Gutterman.
OpenAI said it is working closely with physicians and researchers on how ChatGPT responds in these critical moments.
The company also said it is developing tools to point people toward help in times of crisis.
If you or someone you know needs help, you can reach the Suicide & Crisis Lifeline by calling or texting 988.