SALT LAKE CITY (KUTV) — Utah teens face a big problem when it comes to mental health: there are too few counselors to keep up with demand.
Could AI help young people in need of mental health care, or would it only make the situation worse?
A Utah-based company said AI can bridge the gap between counselors and students, but some therapists said the technology has limitations.
Mele Tua’one is starting her freshman year at West High School and knows it will be a big adjustment.
“I’m nervous because the school is so big, so I feel like it will be hard to find my way around,” she said.
If she were feeling depressed and there were no counselors to talk to, she said she would feel comfortable opening up to an AI chatbot.
“I feel like if I don’t have anyone to talk to, that’s a good option to fall back on,” Mele said.
Her mother, Alexia, said that if no counselors were available, an AI chatbot would be better than nothing.
“If they have nothing, their minds can take them places,” Alexia said.
The company’s CEO, David Barney, said that is exactly why the chatbot, Eliza, was created: there aren’t enough counselors to meet the needs of Utah students.
“If each and every single person currently studying psychology and social work became a licensed therapist, we still wouldn’t be close to meeting the demand,” he said.
But if the advice comes from AI, how can it be fully trusted?

Barney said the company doesn’t rely on the internet to feed its AI information; it depends only on what its clinicians tell it.
“I would say it this way: the advice does not come from AI. We have models developed through our clinical advisory board of licensed, trained psychologists and therapists. They build the model. We use industry best practices,” Barney said. “When we work with our clinical advisory board, we tell them, ‘We’re looking to map your brain to Eliza’s.’”
Child and family psychologist Douglas Goldsmith said there might be a place for AI in treatment, but not for critical issues.
“If a site said, ‘We can help you with the following problems: I can’t understand my homework. I’m having trouble with math. I don’t know what to do with this kid,’ and a written [response] appears: ‘Oh, you’re being bullied at school. Talk to your parents. Talk to your principal. Here are some things you can say.’ That’s fine,” Goldsmith said.
However, he said there is a lot that can’t be detected when a child simply types their problems into a chat screen.
“A machine that responds to a child could miss a lot of nuance and fail to ask the right questions,” he said.
In addition, the chatbot would not be able to pick up on a child’s tone the way a counselor could.
For example, a child who angrily shouts, “I have no friends!” may be facing a very different problem than a child who sadly cries, “I have no friends.”
“This child may just feel lonely and sad that day. This child may be getting bullied. This child may also be suicidal,” Goldsmith said.
Barney stressed that the chatbot is not meant to replace genuine therapy.
KUTV asked if there was ever a point where Eliza would say, “This is too much for me.”
“Of course, we’re not addressing some of those deeper issues to begin with,” Barney said. “Maybe we can provide emotional support to those who are bullied or lonely.”
That raises the next questions: What happens if a student brings up serious issues, such as drug addiction or abuse? And how does the company keep the chats private?
Those questions will be covered in part two of our 2News Investigates report.